Oct 06 12:08:32 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 06 12:08:32 crc restorecon[4669]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 
crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 06 12:08:32 
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
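Worth noting in the etcd static-pod records above: files for the same container name carry three different MCS category pairs (c294,c884 / c336,c1016 / c666,c920), one per container instance, while the pod's etc-hosts holds the current pair. Because container_file_t is a customizable SELinux type, restorecon leaves all of them in place, which is exactly what the "not reset as customized by admin" messages report. Below is a small sketch for spot-checking a label the same way, assuming a Linux host with SELinux enabled; the path is modeled on the log and is illustrative only.

#!/usr/bin/env python3
# Sketch: read the SELinux context restorecon is reporting. Linux-only;
# the example path is copied from the etcd records above.
import os

def selinux_context(path):
    # The label lives in the security.selinux xattr, NUL-terminated,
    # e.g. b"system_u:object_r:container_file_t:s0:c294,c884\x00".
    raw = os.getxattr(path, "security.selinux")
    return raw.rstrip(b"\x00").decode()

def mcs_categories(context):
    # user:role:type:sensitivity[:categories] -> ["c294", "c884"]
    parts = context.split(":")
    return parts[4].split(",") if len(parts) > 4 else []

if __name__ == "__main__":
    pod = "/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d"
    ctx = selinux_context(os.path.join(pod, "etc-hosts"))
    print(ctx, mcs_categories(ctx))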
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
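The catalog-content emptyDir walked above is an unpacked operator catalog: one directory per package, each holding a catalog.json (bpfman-operator ships an index.json instead). Assuming these files follow the file-based-catalog convention, each is a stream of concatenated JSON objects (olm.package, olm.channel, olm.bundle) rather than a single document, so a plain json.load() will not parse them; the following is a hedged sketch using raw_decode, with the volume path copied from the records above.

#!/usr/bin/env python3
# Sketch: enumerate packages in the unpacked catalog volume above and
# count the JSON blobs per package. Assumes file-based-catalog streams.
import json
import os

ROOT = ("/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011"
        "/volumes/kubernetes.io~empty-dir/catalog-content/catalog")

def iter_blobs(path):
    # catalog.json is typically several JSON objects back to back, so
    # decode object by object instead of calling json.load() once.
    decoder = json.JSONDecoder()
    with open(path, encoding="utf-8") as fh:
        text = fh.read()
    pos = 0
    while pos < len(text):
        while pos < len(text) and text[pos].isspace():
            pos += 1
        if pos >= len(text):
            break
        blob, pos = decoder.raw_decode(text, pos)
        yield blob

if __name__ == "__main__":
    for pkg in sorted(os.listdir(ROOT)):
        for name in ("catalog.json", "index.json"):
            f = os.path.join(ROOT, pkg, name)
            if os.path.isfile(f):
                schemas = [b.get("schema") for b in iter_blobs(f)]
                print(f"{pkg}: {len(schemas)} blobs, e.g. {schemas[:2]}")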
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 06 12:08:32 crc restorecon[4669]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 12:08:32 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 
12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc 
restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 06 12:08:33 crc restorecon[4669]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 06 12:08:33 crc kubenswrapper[4892]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 12:08:33 crc kubenswrapper[4892]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 06 12:08:33 crc kubenswrapper[4892]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 06 12:08:33 crc kubenswrapper[4892]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
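The deprecation warnings above, together with the --system-reserved and --pod-infra-container-image warnings that follow, all point at the same remedy the kubelet itself names: move the setting into the file passed via --config. As a reference point only, here is a minimal sketch of the corresponding kubelet.config.k8s.io/v1beta1 fields; the field names come from the upstream KubeletConfiguration API, while every value below is a hypothetical placeholder, not taken from this node:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint (socket path is an assumed CRI-O default)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --register-with-taints (taint shown is illustrative)
    registerWithTaints:
      - key: node-role.kubernetes.io/master
        effect: NoSchedule
    # replaces --system-reserved (sizes are illustrative)
    systemReserved:
      cpu: 500m
      memory: 1Gi
    # per its warning, --minimum-container-ttl-duration gives way to eviction settings
    evictionHard:
      memory.available: 100Mi

--pod-infra-container-image is the odd one out: per its own warning it has no config-file replacement and is simply being removed, with the sandbox image information coming from the CRI runtime instead.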
Oct 06 12:08:33 crc kubenswrapper[4892]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Oct 06 12:08:33 crc kubenswrapper[4892]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.911008 4892 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.916899 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.916935 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.916945 4892 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.916994 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917007 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917018 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917029 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917039 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917049 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917057 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917065 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917073 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917083 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917090 4892 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917098 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917106 4892 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917122 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917130 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917138 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917146 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917154 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917161 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917171 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917182 4892 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917191 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917199 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917206 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917215 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917223 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917233 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917241 4892 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917249 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917257 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917266 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917275 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917283 4892 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917291 4892 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917300 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917308 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917316 4892 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917356 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917364 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917372 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917380 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917388 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917400 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917409 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917417 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917425 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917433 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917441 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917449 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917458 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917465 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917475 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917485 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917495 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917506 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917516 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917527 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917537 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917548 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917559 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917568 4892 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917585 4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917594 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917604 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917614 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917627 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
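These feature_gate.go:330 warnings (which continue below each time the gate set is re-applied) appear to be the kubelet rejecting OpenShift-specific gate names that the upstream Kubernetes gate registry does not know; only recognized names survive into the effective set summarized later at feature_gate.go:386. A minimal Go sketch of that shape of logic, with hypothetical names (known, apply) rather than the k8s.io/component-base/featuregate implementation:

```go
package main

import "fmt"

// known is a tiny stand-in for the kubelet's gate registry; the real set is
// far larger and carries per-gate maturity (alpha/beta/GA/deprecated).
var known = map[string]bool{
	"CloudDualStackNodeIPs":                  true,  // GA default
	"DisableKubeletCloudCredentialProviders": true,  // GA default
	"KMSv1":                                  false, // deprecated default
}

// apply merges requested overrides into the known set, warning on names the
// registry has never heard of - the pattern behind feature_gate.go:330.
func apply(requested map[string]bool) {
	for name, enabled := range requested {
		if _, ok := known[name]; !ok {
			fmt.Printf("W feature_gate: unrecognized feature gate: %s\n", name)
			continue
		}
		known[name] = enabled
	}
	fmt.Printf("I feature_gate: feature gates: %v\n", known)
}

func main() {
	apply(map[string]bool{
		"CloudDualStackNodeIPs": true, // recognized GA gate: accepted
		"OVNObservability":      true, // OpenShift-only name: warned and skipped
		"KMSv1":                 true, // recognized deprecated gate: accepted
	})
}
```

Because Go map iteration order is randomized, each pass through such a map emits the warnings in a different order, which matches how the repeated floods below list the same gate names shuffled.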
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917637 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.917645 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917792 4892 flags.go:64] FLAG: --address="0.0.0.0"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917809 4892 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917822 4892 flags.go:64] FLAG: --anonymous-auth="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917833 4892 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917845 4892 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917854 4892 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917865 4892 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917876 4892 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917885 4892 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917895 4892 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917905 4892 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917914 4892 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917923 4892 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917932 4892 flags.go:64] FLAG: --cgroup-root=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917941 4892 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917950 4892 flags.go:64] FLAG: --client-ca-file=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917959 4892 flags.go:64] FLAG: --cloud-config=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917970 4892 flags.go:64] FLAG: --cloud-provider=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917979 4892 flags.go:64] FLAG: --cluster-dns="[]"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.917991 4892 flags.go:64] FLAG: --cluster-domain=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918000 4892 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918009 4892 flags.go:64] FLAG: --config-dir=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918018 4892 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918027 4892 flags.go:64] FLAG: --container-log-max-files="5"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918038 4892 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918047 4892 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918056 4892 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918065 4892 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918074 4892 flags.go:64] FLAG: --contention-profiling="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918083 4892 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918091 4892 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918101 4892 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918109 4892 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918120 4892 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918129 4892 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918138 4892 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918147 4892 flags.go:64] FLAG: --enable-load-reader="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918156 4892 flags.go:64] FLAG: --enable-server="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918165 4892 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918176 4892 flags.go:64] FLAG: --event-burst="100"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918185 4892 flags.go:64] FLAG: --event-qps="50"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918194 4892 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918203 4892 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918212 4892 flags.go:64] FLAG: --eviction-hard=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918222 4892 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918231 4892 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918240 4892 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918250 4892 flags.go:64] FLAG: --eviction-soft=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918259 4892 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918267 4892 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918276 4892 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918286 4892 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918294 4892 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918304 4892 flags.go:64] FLAG: --fail-swap-on="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918312 4892 flags.go:64] FLAG: --feature-gates=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918357 4892 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918367 4892 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918376 4892 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918385 4892 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918394 4892 flags.go:64] FLAG: --healthz-port="10248"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918403 4892 flags.go:64] FLAG: --help="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918413 4892 flags.go:64] FLAG: --hostname-override=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918422 4892 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918431 4892 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918440 4892 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918449 4892 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918458 4892 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918467 4892 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918476 4892 flags.go:64] FLAG: --image-service-endpoint=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918484 4892 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918494 4892 flags.go:64] FLAG: --kube-api-burst="100"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918503 4892 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918514 4892 flags.go:64] FLAG: --kube-api-qps="50"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918524 4892 flags.go:64] FLAG: --kube-reserved=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918533 4892 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918541 4892 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918551 4892 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918560 4892 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918569 4892 flags.go:64] FLAG: --lock-file=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918579 4892 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918588 4892 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918597 4892 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918610 4892 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918619 4892 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918628 4892 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918637 4892 flags.go:64] FLAG: --logging-format="text"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918645 4892 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918655 4892 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918664 4892 flags.go:64] FLAG: --manifest-url=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.918741 4892 flags.go:64] FLAG: --manifest-url-header=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919056 4892 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919067 4892 flags.go:64] FLAG: --max-open-files="1000000"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919080 4892 flags.go:64] FLAG: --max-pods="110"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919091 4892 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919102 4892 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919112 4892 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919123 4892 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919135 4892 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919153 4892 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919163 4892 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919186 4892 flags.go:64] FLAG: --node-status-max-images="50"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919196 4892 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919206 4892 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919216 4892 flags.go:64] FLAG: --pod-cidr=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919224 4892 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919245 4892 flags.go:64] FLAG: --pod-manifest-path=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919255 4892 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919264 4892 flags.go:64] FLAG: --pods-per-core="0"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919276 4892 flags.go:64] FLAG: --port="10250"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919287 4892 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919296 4892 flags.go:64] FLAG: --provider-id=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919306 4892 flags.go:64] FLAG: --qos-reserved=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919316 4892 flags.go:64] FLAG: --read-only-port="10255"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919355 4892 flags.go:64] FLAG: --register-node="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919365 4892 flags.go:64] FLAG: --register-schedulable="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919375 4892 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919392 4892 flags.go:64] FLAG: --registry-burst="10"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919401 4892 flags.go:64] FLAG: --registry-qps="5"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919412 4892 flags.go:64] FLAG: --reserved-cpus=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919424 4892 flags.go:64] FLAG: --reserved-memory=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919440 4892 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919477 4892 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.919491 4892 flags.go:64] FLAG: --rotate-certificates="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920299 4892 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920389 4892 flags.go:64] FLAG: --runonce="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920427 4892 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920441 4892 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920456 4892 flags.go:64] FLAG: --seccomp-default="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920471 4892 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920522 4892 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920897 4892 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920928 4892 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920942 4892 flags.go:64] FLAG: --storage-driver-password="root"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.920988 4892 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921002 4892 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921015 4892 flags.go:64] FLAG: --storage-driver-user="root"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921028 4892 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921044 4892 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921057 4892 flags.go:64] FLAG: --system-cgroups=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921072 4892 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921111 4892 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921124 4892 flags.go:64] FLAG: --tls-cert-file=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921136 4892 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921154 4892 flags.go:64] FLAG: --tls-min-version=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921166 4892 flags.go:64] FLAG: --tls-private-key-file=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921177 4892 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921188 4892 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921200 4892 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921211 4892 flags.go:64] FLAG: --v="2"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921245 4892 flags.go:64] FLAG: --version="false"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921260 4892 flags.go:64] FLAG: --vmodule=""
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921274 4892 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.921287 4892 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921615 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921628 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921639 4892 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921652 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921663 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921675 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921688 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921698 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921710 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921719 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921729 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921738 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921748 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921761 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921774 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921786 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921797 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921808 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921818 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921829 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921838 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921848 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921857 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921867 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921876 4892 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921885 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921895 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921904 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921914 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921925 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921934 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921944 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921954 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921964 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921974 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921983 4892 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.921993 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922004 4892 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922015 4892 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922025 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922035 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922045 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922055 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922065 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922074 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922084 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922093 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922102 4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922112 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922122 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922131 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922143 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922155 4892 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922165 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922175 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922188 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922200 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922212 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922223 4892 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922236 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922246 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922255 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922264 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922274 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922283 4892 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922293 4892 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922303 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922313 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922349 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922361 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.922371 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.922405 4892 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.938103 4892 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.938164 4892 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938283 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938294 4892 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938301 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938306 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938312 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938317 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938340 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938346 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938351 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938356 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938361 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938366 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938372 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938377 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938382 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938387 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938392 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938397 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938402 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938407 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938413 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938417 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938422 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938428 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938433 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938438 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938444 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938448 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938453 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938458 4892 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938464 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938469 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938475 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938481 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938488 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938492 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938500 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938510 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938517 4892 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938523 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938529 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938535 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938542 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938548 4892 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938553 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938558 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938564 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938570 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938577 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938584 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938589 4892 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938595 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938600 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938605 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938612 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938618 4892 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938624 4892 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938629 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938635 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938641 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938647 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938652 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938657 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938662 4892 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938668 4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938675 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938682 4892 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938688 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938695 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938701 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938710 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.938721 4892 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938937 4892 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938949 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938957 4892 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938964 4892 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938971 4892 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938976 4892 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938982 4892 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938987 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938993 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.938998 4892 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939003 4892 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939008 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939014 4892 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939019 4892 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939024 4892 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939029 4892 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939035 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939040 4892 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939047 4892 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939052 4892 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939057 4892 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939062 4892 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939067 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939073 4892 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939080 4892 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939086 4892 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939094 4892 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939100 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939107 4892 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939113 4892 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939119 4892 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939126 4892 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939131 4892 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939137 4892 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939143 4892 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939148 4892 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939153 4892 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939158 4892 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939163 4892 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939168 4892 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939173 4892 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939178 4892 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939183 4892 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939188 4892 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939193 4892 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939198 4892 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939204 4892 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939209 4892 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939214 4892 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939221 4892 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939227 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939233 4892 feature_gate.go:330] unrecognized feature gate: Example
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939238 4892 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939244 4892 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939249 4892 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939254 4892 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939259 4892 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939267 4892 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939273 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939278 4892 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939284 4892 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939290 4892 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939295 4892 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939301 4892 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939307 4892 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939313 4892 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939319 4892 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939349 4892 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939355 4892 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939361 4892 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 06 12:08:33 crc kubenswrapper[4892]: W1006 12:08:33.939368 4892 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.939375 4892 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.940634 4892 server.go:940] "Client rotation is on, will bootstrap in background" Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.946285 4892 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.946412 4892 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.948404 4892 server.go:997] "Starting client certificate rotation" Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.948444 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.948667 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-06 20:13:45.292120168 +0000 UTC Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.948810 4892 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 2216h5m11.343315284s for next certificate rotation Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.974838 4892 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.977281 4892 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 06 12:08:33 crc kubenswrapper[4892]: I1006 12:08:33.994259 4892 log.go:25] "Validated CRI v1 runtime API" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.033303 4892 log.go:25] "Validated CRI v1 image API" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.035778 4892 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.041736 4892 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-06-12-03-04-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.041813 4892 fs.go:134] Filesystem 
partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.072824 4892 manager.go:217] Machine: {Timestamp:2025-10-06 12:08:34.066835797 +0000 UTC m=+0.616541632 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2d0b290c-b340-4076-a23d-1a9b47beb5f4 BootID:f7bf1197-2aff-4edc-bce6-57187119027c Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:13:ed:a7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:13:ed:a7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d9:0c:b3 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:e0:4d:c9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:25:2f:74 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:fe:e5:62 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9a:3c:38:c9:fa:f9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:9e:2e:50:39:3c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.073261 4892 manager_no_libpfm.go:29] cAdvisor is built without cgo and/or libpfm support. Perf event counters are not available.
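The cAdvisor machine inventory above is worth decoding: the CRC guest sees 12 vCPUs, each exposed as its own single-core socket (NumCores:12, NumPhysicalCores:1, NumSockets:12), which is typical QEMU/KVM topology, roughly 31 GiB of RAM, no swap capacity, and a single 200 GiB virtio disk (vda) whose fourth partition backs /var on XFS. A quick conversion of the raw byte counts, using the values verbatim from the record (this only converts units, it measures nothing):

```go
package main

import "fmt"

func main() {
	const memoryCapacity = 33654120448 // MemoryCapacity, bytes, from the Machine record
	const vdaSize = 214748364800       // DiskMap vda Size, bytes
	fmt.Printf("memory:   %.2f GiB\n", float64(memoryCapacity)/(1<<30)) // ~31.34 GiB
	fmt.Printf("disk vda: %.0f GiB\n", float64(vdaSize)/(1<<30))        // 200 GiB
}
```

The closing manager_no_libpfm.go line is informational, not an error: this cAdvisor build lacks cgo/libpfm, so hardware perf-event counters simply are not exposed.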
Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.073483 4892 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.074139 4892 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.074472 4892 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.074529 4892 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.074837 4892 topology_manager.go:138] "Creating topology manager with none policy" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.074852 4892 container_manager_linux.go:303] "Creating device plugin manager" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.075487 4892 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.075534 4892 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.075861 4892 state_mem.go:36] "Initialized new in-memory state store" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.075977 4892 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.080156 4892 kubelet.go:418] "Attempting to sync node with API server" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.080185 4892 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" 
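The nodeConfig dump above is the kubelet's effective resource contract for this node: systemd cgroup driver on cgroup v2, SystemReserved of 200m CPU / 350Mi memory / 350Mi ephemeral storage (KubeReserved is null), a 4096-PID limit per pod, and hard eviction at 100Mi of free memory plus percentage thresholds for nodefs and imagefs. Those numbers shape node allocatable memory roughly as follows (a simplified sketch of the standard capacity-minus-reservations formula, not kubelet code; the percentage-based disk thresholds are omitted since they do not reduce memory):

```go
package main

import "fmt"

func main() {
	capacity := int64(33654120448)     // MemoryCapacity from the Machine record above
	systemReserved := int64(350) << 20 // SystemReserved "memory":"350Mi"
	evictionHard := int64(100) << 20   // HardEvictionThresholds memory.available "100Mi"
	allocatable := capacity - systemReserved - evictionHard
	fmt.Printf("allocatable memory ~ %.2f GiB\n", float64(allocatable)/(1<<30)) // ~30.90 GiB
}
```

The flurry of `connection refused` errors against https://api-int.crc.testing:6443 that follows is likewise expected at this point in boot: the kubelet comes up before the kube-apiserver static pod it is about to launch from /etc/kubernetes/manifests, so the informer reflectors, the node-lease controller, and event posting all retry until the API server starts listening.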
Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.080206 4892 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.080224 4892 kubelet.go:324] "Adding apiserver pod source" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.080239 4892 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.086036 4892 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.088055 4892 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.089826 4892 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.091124 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.091139 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.091289 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.091358 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.091912 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.091964 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.091979 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.091993 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092018 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092032 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092046 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092070 4892 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092087 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092103 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092122 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.092135 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.094022 4892 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.094738 4892 server.go:1280] "Started kubelet" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.094906 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:34 crc systemd[1]: Started Kubernetes Kubelet. Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.097337 4892 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.097371 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.097315 4892 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.097428 4892 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.097620 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 13:32:22.603549684 +0000 UTC Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.097694 4892 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2449h23m48.505861565s for next certificate rotation Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.106290 4892 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.106312 4892 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.106469 4892 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.106477 4892 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.106722 4892 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.111755 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="200ms" Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.112535 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.112675 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.113120 4892 server.go:460] "Adding debug handlers to kubelet server" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.113181 4892 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.113208 4892 factory.go:55] Registering systemd factory Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.113229 4892 factory.go:221] Registration of the systemd container factory successfully Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.112062 4892 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.144:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186be596873705d9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-06 12:08:34.094695897 +0000 UTC m=+0.644401702,LastTimestamp:2025-10-06 12:08:34.094695897 +0000 UTC m=+0.644401702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.113836 4892 factory.go:153] Registering CRI-O factory Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.113937 4892 factory.go:221] Registration of the crio container factory successfully Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.114026 4892 factory.go:103] Registering Raw factory Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.114109 4892 manager.go:1196] Started watching for new ooms in manager Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.114817 4892 manager.go:319] Starting recovery of all containers Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.126786 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.126878 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.126925 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.126955 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.126982 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127022 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127052 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127104 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127154 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127194 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127394 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127436 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127480 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127527 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127634 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127675 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127713 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127740 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127777 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127867 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127908 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127945 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.127974 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128077 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128109 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128143 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128299 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128405 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128451 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128490 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128527 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128550 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128582 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128606 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128626 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128656 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128679 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128709 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128731 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128751 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128777 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128800 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128830 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128892 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128919 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128947 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128969 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.128988 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129014 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129035 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129091 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129118 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129157 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129187 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129211 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129258 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129290 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129362 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129392 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129428 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129450 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129474 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129508 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129529 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129564 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129585 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129606 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129765 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129796 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129826 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129848 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129868 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129894 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129921 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129968 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.129990 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.130011 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.133772 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.133877 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.133915 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.133980 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134032 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134116 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134165 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134196 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134244 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134288 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134418 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134465 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134492 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134556 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134597 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134650 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134697 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134730 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134780 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134828 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134865 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134899 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134928 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134964 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.134998 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.135035 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.135077 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.135173 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.135230 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.135302 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136251 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136334 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136357 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136376 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136392 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136409 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136426 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136441 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136457 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136470 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136483 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136525 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136541 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136555 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136568 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136581 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136594 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136607 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136622 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136636 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136651 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136664 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136680 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136694 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136708 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136721 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136734 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136748 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136763 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136779 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136795 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136846 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136869 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136887 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136902 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136922 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136940 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136956 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136971 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136986 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.136999 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137052 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137094 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137111 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137126 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137143 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137158 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137173 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137187 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137202 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137217 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137231 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137246 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137261 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137274 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137290 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137304 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137351 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137367 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137381 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.137396 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139568 4892 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139605 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139623 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139640 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139658 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139675 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139689 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139704 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139720 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139733 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139747 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139762 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139699 4892 manager.go:324] Recovery completed Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.139776 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140424 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140447 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140463 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140477 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140492 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140507 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140528 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140550 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140564 4892 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140579 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140594 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140607 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140623 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140637 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140651 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140665 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140678 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140691 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140705 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140717 4892 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140730 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140744 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140757 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140775 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140787 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140800 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140814 4892 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140828 4892 reconstruct.go:97] "Volume reconstruction finished" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.140837 4892 reconciler.go:26] "Reconciler: start to sync state" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.149151 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.150939 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.150997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.151017 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.152545 4892 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.152697 4892 
cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.152820 4892 state_mem.go:36] "Initialized new in-memory state store" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.164729 4892 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.167240 4892 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.167299 4892 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.167348 4892 kubelet.go:2335] "Starting kubelet main sync loop" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.167488 4892 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.167751 4892 policy_none.go:49] "None policy: Start" Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.168099 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.168151 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.168816 4892 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.168851 4892 state_mem.go:35] "Initializing new in-memory state store" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.207066 4892 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.247395 4892 manager.go:334] "Starting Device Plugin manager" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.247708 4892 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.247733 4892 server.go:79] "Starting device plugin registration server" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.248312 4892 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.248398 4892 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.248633 4892 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.248732 4892 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.248742 4892 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.258898 4892 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.268318 4892 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.268443 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.269886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.269940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.269961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.270139 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.270313 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.270396 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.271431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.271478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.271501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.272962 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.273015 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.273064 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.273248 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.273506 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.273583 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.274911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.274957 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.274977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.275125 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.275249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.275301 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.275319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.275372 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.275421 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276162 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276434 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276447 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276531 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276572 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.276411 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.277988 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.278014 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.278028 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.278191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.278207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.278219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.278469 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.278513 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.279635 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.279666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.279679 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.313440 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="400ms" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343168 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343202 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343220 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343237 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343254 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343270 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343286 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343428 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343635 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343668 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343691 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343714 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343742 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343761 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.343785 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.348898 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.350530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.350581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.350591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.350618 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.351125 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445697 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445777 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445817 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445846 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445879 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445909 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445939 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.445968 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446002 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446030 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446047 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446090 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446176 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446051 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446201 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446243 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446252 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446257 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446211 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446478 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446309 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446177 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446548 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446559 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446625 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446673 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.446709 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.551980 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.553956 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.553990 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.553998 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.554023 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 
Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.598417 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.620169 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.641906 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.660916 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9cc0be0ee8bee44ac5a4ace5bcc1178f637a3b1c2c77f65134e9f0ff0a90bf1a WatchSource:0}: Error finding container 9cc0be0ee8bee44ac5a4ace5bcc1178f637a3b1c2c77f65134e9f0ff0a90bf1a: Status 404 returned error can't find the container with id 9cc0be0ee8bee44ac5a4ace5bcc1178f637a3b1c2c77f65134e9f0ff0a90bf1a
Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.663272 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d8aa12c639ee3803b6251e27a8bdc9157ddcd4ae493215cb70e8c8224c9f8b14 WatchSource:0}: Error finding container d8aa12c639ee3803b6251e27a8bdc9157ddcd4ae493215cb70e8c8224c9f8b14: Status 404 returned error can't find the container with id d8aa12c639ee3803b6251e27a8bdc9157ddcd4ae493215cb70e8c8224c9f8b14
Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.668720 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.671229 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2f8e2fcb36166ee37d2ac9cb48df1e7e94b3fa095cf5996c86d91902d0250a74 WatchSource:0}: Error finding container 2f8e2fcb36166ee37d2ac9cb48df1e7e94b3fa095cf5996c86d91902d0250a74: Status 404 returned error can't find the container with id 2f8e2fcb36166ee37d2ac9cb48df1e7e94b3fa095cf5996c86d91902d0250a74
Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.678554 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.689426 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-53f7d4f0044c1d319e84818d1fc1da0091b107cd239f3297488c7f1ef48896bb WatchSource:0}: Error finding container 53f7d4f0044c1d319e84818d1fc1da0091b107cd239f3297488c7f1ef48896bb: Status 404 returned error can't find the container with id 53f7d4f0044c1d319e84818d1fc1da0091b107cd239f3297488c7f1ef48896bb Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.712554 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-57186755d065b224f4dcb99aa601aa884e775ca8e25d3e2771457a8bef5beded WatchSource:0}: Error finding container 57186755d065b224f4dcb99aa601aa884e775ca8e25d3e2771457a8bef5beded: Status 404 returned error can't find the container with id 57186755d065b224f4dcb99aa601aa884e775ca8e25d3e2771457a8bef5beded Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.714213 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="800ms" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.955760 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.957693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.957755 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.957775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:34 crc kubenswrapper[4892]: I1006 12:08:34.957812 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.958449 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Oct 06 12:08:34 crc kubenswrapper[4892]: W1006 12:08:34.984868 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:34 crc kubenswrapper[4892]: E1006 12:08:34.984975 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.095968 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: 
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.172199 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9cc0be0ee8bee44ac5a4ace5bcc1178f637a3b1c2c77f65134e9f0ff0a90bf1a"}
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.173827 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d8aa12c639ee3803b6251e27a8bdc9157ddcd4ae493215cb70e8c8224c9f8b14"}
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.175612 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"57186755d065b224f4dcb99aa601aa884e775ca8e25d3e2771457a8bef5beded"}
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.177450 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53f7d4f0044c1d319e84818d1fc1da0091b107cd239f3297488c7f1ef48896bb"}
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.178918 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2f8e2fcb36166ee37d2ac9cb48df1e7e94b3fa095cf5996c86d91902d0250a74"}
Oct 06 12:08:35 crc kubenswrapper[4892]: W1006 12:08:35.374430 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Oct 06 12:08:35 crc kubenswrapper[4892]: E1006 12:08:35.374896 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError"
Oct 06 12:08:35 crc kubenswrapper[4892]: E1006 12:08:35.515601 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="1.6s"
Oct 06 12:08:35 crc kubenswrapper[4892]: W1006 12:08:35.615530 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Oct 06 12:08:35 crc kubenswrapper[4892]: E1006 12:08:35.615672 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError"
Oct 06 12:08:35 crc kubenswrapper[4892]: W1006 12:08:35.669041 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Oct 06 12:08:35 crc kubenswrapper[4892]: E1006 12:08:35.669174 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError"
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.759003 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.761153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.761220 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.761234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:35 crc kubenswrapper[4892]: I1006 12:08:35.761264 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 06 12:08:35 crc kubenswrapper[4892]: E1006 12:08:35.762006 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.096250 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.186827 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b"}
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.186904 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897"}
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.186915 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.186926 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173"}
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.187091 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569"}
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.188650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.188716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.188738 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.189434 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505" exitCode=0
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.189686 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.189717 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505"}
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.191109 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.191155 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.191174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.192591 4892 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a" exitCode=0
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.192725 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.192749 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a"}
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.193796 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.194073 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.194113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.194158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.195068 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.195142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.195172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.195470 4892 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb" exitCode=0 Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.195574 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb"} Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.195633 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.197054 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.197097 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.197115 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.198819 4892 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c" exitCode=0 Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.198859 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c"} Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.198950 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.200440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.200506 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:36 crc kubenswrapper[4892]: I1006 12:08:36.200529 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.096575 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:37 crc kubenswrapper[4892]: E1006 12:08:37.116823 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.144:6443: connect: connection refused" interval="3.2s" Oct 06 12:08:37 crc kubenswrapper[4892]: W1006 12:08:37.140604 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.204366 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85"}
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.204429 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273"}
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.204446 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5"}
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.207032 4892 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d" exitCode=0
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.207097 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d"}
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.207231 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.208244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.208273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.208284 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.209796 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42"}
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.209914 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.212006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.212055 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.212076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
event="NodeHasSufficientPID" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.213293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7"} Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.213391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25"} Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.213421 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82"} Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.213428 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.213316 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.214555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.214622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.214649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.215755 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.215798 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.215815 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:37 crc kubenswrapper[4892]: W1006 12:08:37.283195 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:37 crc kubenswrapper[4892]: E1006 12:08:37.283311 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.363183 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.364295 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.364352 4892 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.364363 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:37 crc kubenswrapper[4892]: I1006 12:08:37.364391 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 12:08:37 crc kubenswrapper[4892]: E1006 12:08:37.364891 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.144:6443: connect: connection refused" node="crc" Oct 06 12:08:37 crc kubenswrapper[4892]: W1006 12:08:37.861360 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.144:6443: connect: connection refused Oct 06 12:08:37 crc kubenswrapper[4892]: E1006 12:08:37.861443 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.144:6443: connect: connection refused" logger="UnhandledError" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.220490 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"29a9efdb9482e02ce084c5d662ca8b4cb7d378257b890ac0cc069689a5a46ff4"} Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.220574 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24"} Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.220604 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.222230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.222284 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.222305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.224194 4892 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04" exitCode=0 Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.224399 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.225431 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04"} Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.225488 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.225538 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.225566 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.225747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.225776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.225792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.232940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.232982 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.232991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.233070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.233415 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.233476 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.464455 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.464627 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.465980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.466031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.466048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:38 crc kubenswrapper[4892]: I1006 12:08:38.471605 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.003474 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.231394 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.231412 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f"} Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.232182 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a"} Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.232214 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e"} Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.231481 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.231467 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.232296 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234190 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234284 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234353 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.234220 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:39 crc kubenswrapper[4892]: I1006 12:08:39.343237 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.241104 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.241542 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4"} Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.241617 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b"} Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.241652 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.242653 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.242699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.242717 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.243113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.243173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.243193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.565642 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.567272 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.567375 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.567395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.567431 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.606917 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.607074 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.607123 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.608675 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.608737 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:40 crc kubenswrapper[4892]: I1006 12:08:40.608758 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:41 crc kubenswrapper[4892]: I1006 12:08:41.244946 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:41 crc kubenswrapper[4892]: I1006 12:08:41.244973 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:41 crc 
Oct 06 12:08:41 crc kubenswrapper[4892]: I1006 12:08:41.247367 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:41 crc kubenswrapper[4892]: I1006 12:08:41.247389 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:41 crc kubenswrapper[4892]: I1006 12:08:41.247412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:41 crc kubenswrapper[4892]: I1006 12:08:41.247451 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:41 crc kubenswrapper[4892]: I1006 12:08:41.247417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.030509 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.030798 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.032490 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.032551 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.032568 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.817037 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.817522 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.819403 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.819464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:42 crc kubenswrapper[4892]: I1006 12:08:42.819482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.067030 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.067264 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.069026 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.069099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.069122 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.112641 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.112833 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.114182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.114224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.114241 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.543982 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.544284 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.546072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.546149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:43 crc kubenswrapper[4892]: I1006 12:08:43.546168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:44 crc kubenswrapper[4892]: E1006 12:08:44.259478 4892 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 06 12:08:45 crc kubenswrapper[4892]: I1006 12:08:45.818035 4892 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 06 12:08:45 crc kubenswrapper[4892]: I1006 12:08:45.818149 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.096095 4892 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.269757 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.272117 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="29a9efdb9482e02ce084c5d662ca8b4cb7d378257b890ac0cc069689a5a46ff4" exitCode=255 Oct 
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.272360 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.273581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.273624 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.273635 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.274294 4892 scope.go:117] "RemoveContainer" containerID="29a9efdb9482e02ce084c5d662ca8b4cb7d378257b890ac0cc069689a5a46ff4"
Oct 06 12:08:48 crc kubenswrapper[4892]: W1006 12:08:48.807372 4892 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.807782 4892 trace.go:236] Trace[1718652236]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 12:08:38.805) (total time: 10002ms):
Oct 06 12:08:48 crc kubenswrapper[4892]: Trace[1718652236]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (12:08:48.807)
Oct 06 12:08:48 crc kubenswrapper[4892]: Trace[1718652236]: [10.002342923s] [10.002342923s] END
Oct 06 12:08:48 crc kubenswrapper[4892]: E1006 12:08:48.807871 4892 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.989150 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.989221 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.996581 4892 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 06 12:08:48 crc kubenswrapper[4892]: I1006 12:08:48.996624 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.009415 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.009595 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.011304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.011364 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.011376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.276746 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.278769 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1"} Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.278992 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.280079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.280112 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:49 crc kubenswrapper[4892]: I1006 12:08:49.280121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:08:50 crc kubenswrapper[4892]: I1006 12:08:50.614082 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:50 crc kubenswrapper[4892]: I1006 12:08:50.614310 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:08:50 crc kubenswrapper[4892]: I1006 12:08:50.614386 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:08:50 crc kubenswrapper[4892]: I1006 12:08:50.615990 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:08:50 crc kubenswrapper[4892]: I1006 12:08:50.616068 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:08:50 crc 
Oct 06 12:08:50 crc kubenswrapper[4892]: I1006 12:08:50.620970 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 06 12:08:51 crc kubenswrapper[4892]: I1006 12:08:51.284429 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:51 crc kubenswrapper[4892]: I1006 12:08:51.285705 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:51 crc kubenswrapper[4892]: I1006 12:08:51.285762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:51 crc kubenswrapper[4892]: I1006 12:08:51.285777 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:52 crc kubenswrapper[4892]: I1006 12:08:52.286060 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:52 crc kubenswrapper[4892]: I1006 12:08:52.286922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:52 crc kubenswrapper[4892]: I1006 12:08:52.287008 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:52 crc kubenswrapper[4892]: I1006 12:08:52.287033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:52 crc kubenswrapper[4892]: I1006 12:08:52.522155 4892 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.576713 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.577692 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.579187 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.579268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.579284 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.596122 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Oct 06 12:08:53 crc kubenswrapper[4892]: E1006 12:08:53.957641 4892 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.960138 4892 trace.go:236] Trace[1101265710]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 12:08:42.667) (total time: 11292ms):
Oct 06 12:08:53 crc kubenswrapper[4892]: Trace[1101265710]: ---"Objects listed" error: 11292ms (12:08:53.960)
Oct 06 12:08:53 crc kubenswrapper[4892]: Trace[1101265710]: [11.292101749s] [11.292101749s] END
Trace[1101265710]: [11.292101749s] [11.292101749s] END Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.960211 4892 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.960151 4892 trace.go:236] Trace[1935446218]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 12:08:42.198) (total time: 11761ms): Oct 06 12:08:53 crc kubenswrapper[4892]: Trace[1935446218]: ---"Objects listed" error: 11761ms (12:08:53.960) Oct 06 12:08:53 crc kubenswrapper[4892]: Trace[1935446218]: [11.761120372s] [11.761120372s] END Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.960277 4892 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.961862 4892 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.962028 4892 trace.go:236] Trace[623777582]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (06-Oct-2025 12:08:43.820) (total time: 10141ms): Oct 06 12:08:53 crc kubenswrapper[4892]: Trace[623777582]: ---"Objects listed" error: 10141ms (12:08:53.961) Oct 06 12:08:53 crc kubenswrapper[4892]: Trace[623777582]: [10.141812285s] [10.141812285s] END Oct 06 12:08:53 crc kubenswrapper[4892]: I1006 12:08:53.962058 4892 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 06 12:08:53 crc kubenswrapper[4892]: E1006 12:08:53.962530 4892 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.040427 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.052925 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.092471 4892 apiserver.go:52] "Watching apiserver" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.104174 4892 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.104593 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.105171 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.105226 4892 util.go:30] "No sandbox for pod can be found. 
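The two E-level records above are two symptoms of the same stall: the kubelet heartbeats by renewing a coordination.k8s.io/v1 Lease named after the node in the kube-node-lease namespace, and both the lease update and the node registration are timing out or being rejected while https://api-int.crc.testing:6443 recovers. Below is a minimal client-go sketch for inspecting that Lease from outside the kubelet (illustrative only; reading the kubeconfig path from KUBECONFIG is an assumption):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption for this sketch: KUBECONFIG points at a working kubeconfig.
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// The kubelet renews this Lease as its heartbeat; the log's retry loop
	// ("Failed to ensure lease exists, will retry") is this object failing to update.
	lease, err := cs.CoordinationV1().Leases("kube-node-lease").Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	if lease.Spec.HolderIdentity != nil {
		fmt.Println("holder:", *lease.Spec.HolderIdentity)
	}
	fmt.Println("renewTime:", lease.Spec.RenewTime)
}
```

A renewTime that stops advancing past the lease duration (40s by the kubelet's default) is exactly the condition the retry loop above is working to clear.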
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.105903 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.106093 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.106219 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.106956 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.107029 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.107118 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.107173 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.107896 4892 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.109036 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.110708 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.110981 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.111200 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.111443 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.114057 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.118388 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.118841 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.118927 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.152635 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163023 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163077 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163103 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163124 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163023 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163077 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163103 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163124 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163146 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163167 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163190 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163220 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163242 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163261 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163282 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163301 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163339 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163368 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163390 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163411 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163432 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163465 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163485 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163507 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163527 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163549 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163570 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163592 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163612 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163633 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163654 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163674 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163700 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163720 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163742 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163766 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163797 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163826 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163847 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163869 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163891 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163912 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163956 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.163978 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164222 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164243 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164264 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164285 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164311 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164354 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164380 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164401 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164445 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164466 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164487 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164508 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164531 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164553 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164577 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164600 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164632 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164656 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164681 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164706 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164731 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164757 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164781 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164804 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164828 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164852 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164882 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164903 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164925 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164947 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.164966 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165008 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165028 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165049 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165072 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165093 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165138 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165158 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165177 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165217 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165237 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165259 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165282 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165305 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165352 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165379 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165401 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165431 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165454 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165476 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165497 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165520 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165541 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165561 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165583 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165607 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165635 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165660 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165681 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165703 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165725 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165747 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165768 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165791 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165812 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165834 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165856 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165880 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165901 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165926 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165951 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.165974 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166010 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166034 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166056 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166079 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166104 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166127 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166151 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166174 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166195 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166220 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166246 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166272 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166335 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166372 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166405 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166427 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166450 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166473 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166497 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166526 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166549 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166571 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166595 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166625 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166651 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166674 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166699 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166722 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166746 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166772 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166794 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166817 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166840 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166864 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.166887 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167194 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167222 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167249 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167275 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167302 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167377 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167424 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167462 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167489 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167524 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167537 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167549 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167574 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167600 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167626 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167656 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167682 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167716 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167742 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167768 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167792 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167829 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167857 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167881 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167904 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167929 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167955 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167980 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168004 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168027 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168052 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") 
pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168078 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168103 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168129 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168152 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168175 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168201 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168224 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168249 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168274 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168299 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" 
(UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168343 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168369 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168399 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168473 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168505 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168538 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168567 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168594 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168622 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168651 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168674 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168700 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168730 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168755 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168779 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168807 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168831 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168899 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168917 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.174208 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.175445 4892 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167692 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167685 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167744 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167773 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167796 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167805 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167828 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167900 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.167977 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168024 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168058 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168078 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168214 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168281 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168339 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168377 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168541 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168577 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168834 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168835 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168891 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168903 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168963 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169045 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169060 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169087 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169195 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169233 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179217 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169264 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169353 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169378 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169546 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169606 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169648 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169681 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169708 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.169809 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172209 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172376 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172385 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172414 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172503 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172509 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172570 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172600 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172649 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172746 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172777 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172787 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172884 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172909 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.172915 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173043 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173042 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173038 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173202 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173335 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173414 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173489 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173507 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173567 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173634 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173732 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173743 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173756 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173762 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173867 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173883 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.173952 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.174017 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.174037 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.174075 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.174079 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.175266 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.175773 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.175832 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176122 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176142 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176201 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176268 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176354 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176425 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176436 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176532 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.176605 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:08:54.676589077 +0000 UTC m=+21.226294842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176746 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176747 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176794 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.176909 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.177009 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.177038 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.177055 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.177223 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178047 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178118 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178181 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178232 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178245 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178417 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178543 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178872 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.178872 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.168504 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179264 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179477 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179724 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179590 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179865 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.180297 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.180449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.180779 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.180979 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.181045 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.181404 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.181426 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.181503 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.181741 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.181763 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.181999 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182135 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182159 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182346 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182380 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182390 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182425 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182486 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182517 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.182843 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.183231 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.183255 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.179563 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.183383 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:54.683316068 +0000 UTC m=+21.233021893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.183453 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.183766 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.183809 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:54.683794881 +0000 UTC m=+21.233500646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.184013 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.184052 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.184063 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.184221 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.184415 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.184434 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.185146 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.185714 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.186075 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.186178 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.186392 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.186788 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.186826 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.187345 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.187582 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.187593 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.187851 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.187862 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.188139 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.188154 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.188435 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.188516 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.188927 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.188942 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.189913 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.191176 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.192377 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.194631 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.194938 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.194395 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.196449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.196737 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.196843 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.196945 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.197247 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.197347 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.198672 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.198878 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.199088 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.199108 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.199126 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.199247 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:54.699159418 +0000 UTC m=+21.248865183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.199090 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.199823 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.199850 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.201737 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.202017 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.202272 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.202293 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.202304 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.202367 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:54.702350278 +0000 UTC m=+21.252056043 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.202746 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.204312 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.204425 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.205025 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.205529 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.208540 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.208598 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.208784 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.208796 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.208975 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.209082 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.209097 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.209653 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.210588 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.210637 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.210749 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.211681 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.211856 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.213178 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.213237 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.213342 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.216533 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.217505 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.221091 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
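The status_manager.go:875 entry above fails for a reason unrelated to the pod being patched: the API server must consult the pod.network-node-identity.openshift.io webhook before accepting the status patch, and nothing is listening on 127.0.0.1:9743 yet. A quick connectivity probe for that endpoint, as a hypothetical diagnostic; the address is taken from the log and nothing else is assumed:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Attempt the same TCP connection the API server's webhook client makes.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
	if err != nil {
		// Matches the log: dial tcp 127.0.0.1:9743: connect: connection refused
		fmt.Println("webhook unreachable:", err)
		return
	}
	conn.Close()
	fmt.Println("webhook endpoint is accepting connections")
}
```

The refusal is self-referential: the webhook is served by the network-node-identity pod, which is itself still being recreated, so every status patch in this window fails until that pod's webhook container comes up.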
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.221104 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.223523 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.226070 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.226747 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.229900 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.230475 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.230892 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.237954 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.238066 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.240600 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.241384 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.241856 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.242429 4892 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.245152 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.247102 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
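The kubelet_volumes.go:163 entries above record the janitor pass that runs once a deleted pod's volumes have all been torn down: the pod's volumes directory under /var/lib/kubelet/pods/<uid> is removed. An illustrative sketch of that sweep follows, under assumed structure rather than kubelet's actual source; the activePods set and the function name are hypothetical.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cleanupOrphanedPodDirs removes the volumes subdirectory of every pod
// directory whose UID is no longer a known pod on this node.
func cleanupOrphanedPodDirs(podsRoot string, activePods map[string]bool) error {
	entries, err := os.ReadDir(podsRoot)
	if err != nil {
		return err
	}
	for _, e := range entries {
		uid := e.Name()
		if !e.IsDir() || activePods[uid] {
			continue
		}
		volumes := filepath.Join(podsRoot, uid, "volumes")
		// Only safe once every volume has been unmounted and torn down,
		// which is what the preceding UnmountVolume.TearDown entries show.
		if err := os.RemoveAll(volumes); err != nil {
			return err
		}
		fmt.Printf("Cleaned up orphaned pod volumes dir podUID=%q path=%q\n", uid, volumes)
	}
	return nil
}

func main() {
	_ = cleanupOrphanedPodDirs("/var/lib/kubelet/pods", map[string]bool{
		"37a5e44f-9a88-4405-be8a-b645485e7312": true, // still-active pod from the log
	})
}
```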
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.249028 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.250194 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.250727 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.252687 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.253217 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.253899 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.254465 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.255941 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.257447 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.258089 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.258277 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.258625 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.258904 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.260029 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.260729 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.261735 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.262345 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.263270 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.264222 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.265207 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.265946 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.266031 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.266569 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.267636 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.268223 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.268759 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 
12:08:54.269628 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-scxvc"] Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.269793 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.269909 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270022 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.269912 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-scxvc" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270127 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270186 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270244 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270300 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270384 4892 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270441 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270496 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270551 4892 reconciler_common.go:293] "Volume detached for 
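The reconciler_common.go entries above are two halves of one control loop: volumes that the desired state of the world requires but the node lacks get MountVolume operations (code 218), while volumes no longer wanted by any pod are confirmed detached (code 293). A toy model of that desired-versus-actual comparison, offered as an assumption-level sketch rather than the real reconciler:

```go
package main

import "fmt"

// reconcile compares a desired set of volumes against the actual set
// and emits the two kinds of actions visible in the log above.
func reconcile(desired, actual map[string]bool) {
	for name := range desired {
		if !actual[name] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", name)
		}
	}
	for name := range actual {
		if !desired[name] {
			fmt.Printf("Volume detached for volume %q\n", name)
		}
	}
}

func main() {
	// Names taken from the surrounding entries; membership is illustrative.
	desired := map[string]bool{"host-slash": true, "host-etc-kube": true}
	actual := map[string]bool{"marketplace-operator-metrics": true}
	reconcile(desired, actual)
}
```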
volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270606 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270660 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270710 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270530 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270776 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270880 4892 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270938 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.270993 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271047 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271097 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271147 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271203 4892 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node 
\"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271258 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271311 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271400 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271462 4892 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271517 4892 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271578 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271635 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271689 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271747 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271802 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271859 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271916 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.271971 4892 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" 
DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272020 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272068 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272123 4892 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272177 4892 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272232 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272290 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272366 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272424 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272478 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272533 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272591 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272651 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272710 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: 
I1006 12:08:54.272761 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272842 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272894 4892 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.272974 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273055 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273132 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273193 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273254 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273310 4892 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273390 4892 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273448 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273507 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273563 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 
06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273626 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273688 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273745 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273800 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273855 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273916 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273974 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274028 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274099 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274159 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274213 4892 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274273 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274347 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: 
I1006 12:08:54.272923 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274402 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274554 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274566 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274577 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274586 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274596 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274605 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274613 4892 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274621 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274631 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274640 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274649 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274659 4892 reconciler_common.go:293] "Volume detached for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274668 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274675 4892 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274684 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274693 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274702 4892 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274710 4892 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274722 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274732 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274745 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274761 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274772 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274784 4892 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274795 4892 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274806 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274817 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274828 4892 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274839 4892 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274851 4892 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274859 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274868 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274878 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274888 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.274897 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273142 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.273368 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275411 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275428 4892 reconciler_common.go:293] "Volume detached for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275437 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275445 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275454 4892 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275463 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275474 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275483 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275493 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275501 4892 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275509 4892 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275521 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275530 4892 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275538 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275546 4892 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275554 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275563 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275573 4892 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275583 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275592 4892 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275601 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275612 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275621 4892 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275631 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275639 4892 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275647 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275656 4892 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275665 4892 reconciler_common.go:293] "Volume detached for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275673 4892 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275692 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275703 4892 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275711 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275719 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275728 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275735 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275745 4892 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275753 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275761 4892 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275773 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275781 4892 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275789 4892 reconciler_common.go:293] "Volume detached for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275797 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275805 4892 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275813 4892 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275821 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275829 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275838 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275845 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275853 4892 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275862 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275871 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275879 4892 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275935 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275943 4892 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275951 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275961 4892 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275969 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275976 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275985 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.275993 4892 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276000 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276009 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276017 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276027 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276035 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276043 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276051 4892 reconciler_common.go:293] "Volume detached 
for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276061 4892 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276069 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276077 4892 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276086 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276099 4892 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276107 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276115 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276122 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276130 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276140 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276148 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276156 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276163 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276171 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276179 4892 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276187 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276195 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276203 4892 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.276833 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.287799 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.295220 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.304443 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.317155 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.323817 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.323934 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.334821 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\
\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.344915 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.355830 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.366684 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.375377 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.376608 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/452fb3c0-569f-4c83-ba2a-7e3bafcd509d-hosts-file\") pod \"node-resolver-scxvc\" (UID: \"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\") " pod="openshift-dns/node-resolver-scxvc"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.376655 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq6vg\" (UniqueName: \"kubernetes.io/projected/452fb3c0-569f-4c83-ba2a-7e3bafcd509d-kube-api-access-hq6vg\") pod \"node-resolver-scxvc\" (UID: \"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\") " pod="openshift-dns/node-resolver-scxvc"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.384162 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.391814 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.399764 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.408554 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.420069 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.429011 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.429970 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.436547 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 06 12:08:54 crc kubenswrapper[4892]: W1006 12:08:54.440368 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-b6da3daf7f0dc3a9f4849225fc1f35977aab95d1b563bf10455cf2ef57d6b855 WatchSource:0}: Error finding container b6da3daf7f0dc3a9f4849225fc1f35977aab95d1b563bf10455cf2ef57d6b855: Status 404 returned error can't find the container with id b6da3daf7f0dc3a9f4849225fc1f35977aab95d1b563bf10455cf2ef57d6b855
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.444361 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: W1006 12:08:54.448430 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b7d20a0d256125ee4b36eca25ab5c73474f437569b71602e2b02560d072a95f2 WatchSource:0}: Error finding container b7d20a0d256125ee4b36eca25ab5c73474f437569b71602e2b02560d072a95f2: Status 404 returned error can't find the container with id b7d20a0d256125ee4b36eca25ab5c73474f437569b71602e2b02560d072a95f2
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.456986 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.473006 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.479337 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq6vg\" (UniqueName: \"kubernetes.io/projected/452fb3c0-569f-4c83-ba2a-7e3bafcd509d-kube-api-access-hq6vg\") pod \"node-resolver-scxvc\" (UID: \"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\") " pod="openshift-dns/node-resolver-scxvc"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.479387 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/452fb3c0-569f-4c83-ba2a-7e3bafcd509d-hosts-file\") pod \"node-resolver-scxvc\" (UID: \"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\") " pod="openshift-dns/node-resolver-scxvc"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.479446 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/452fb3c0-569f-4c83-ba2a-7e3bafcd509d-hosts-file\") pod \"node-resolver-scxvc\" (UID: \"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\") " pod="openshift-dns/node-resolver-scxvc"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.492164 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.502458 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq6vg\" (UniqueName: \"kubernetes.io/projected/452fb3c0-569f-4c83-ba2a-7e3bafcd509d-kube-api-access-hq6vg\") pod \"node-resolver-scxvc\" (UID: \"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\") " pod="openshift-dns/node-resolver-scxvc"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.587308 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-scxvc"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.680611 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.680735 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:08:55.680722274 +0000 UTC m=+22.230428039 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.781548 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.781599 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.781626 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:08:54 crc kubenswrapper[4892]: I1006 12:08:54.781651 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781715 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781736 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781771 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:55.781754892 +0000 UTC m=+22.331460657 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781799 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:55.781792123 +0000 UTC m=+22.331497888 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781846 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781861 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781862 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781874 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781878 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781889 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781910 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:55.781894886 +0000 UTC m=+22.331600661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:08:54 crc kubenswrapper[4892]: E1006 12:08:54.781928 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:55.781920236 +0000 UTC m=+22.331626011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.032781 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-djjtr"]
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.033184 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-djjtr"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.035644 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.035874 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.036003 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.038155 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.050760 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.067443 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.078354 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.090108 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.106464 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri
-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.114727 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.124395 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.135719 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.147414 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.159875 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 
06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.168125 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.168350 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.184604 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b933165-d6e5-4add-ada2-6c87697e668b-serviceca\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.184676 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmlmn\" (UniqueName: \"kubernetes.io/projected/1b933165-d6e5-4add-ada2-6c87697e668b-kube-api-access-xmlmn\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.184706 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b933165-d6e5-4add-ada2-6c87697e668b-host\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.285298 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmlmn\" (UniqueName: \"kubernetes.io/projected/1b933165-d6e5-4add-ada2-6c87697e668b-kube-api-access-xmlmn\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.285419 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b933165-d6e5-4add-ada2-6c87697e668b-host\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.285463 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b933165-d6e5-4add-ada2-6c87697e668b-serviceca\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.285640 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b933165-d6e5-4add-ada2-6c87697e668b-host\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.286599 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b933165-d6e5-4add-ada2-6c87697e668b-serviceca\") pod 
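The pod_workers error above attributes the sandbox failure to a missing CNI configuration in /etc/kubernetes/cni/net.d/. A minimal sketch for confirming that directory's state on the node, assuming Python is available there; the path is taken verbatim from the NetworkPluginNotReady message, and an empty or missing directory is consistent with the "Has your network provider started?" hint:

```python
import os

CNI_DIR = "/etc/kubernetes/cni/net.d/"  # path quoted in the NetworkPluginNotReady error

# The kubelet keeps reporting NetworkReady=false until a CNI config file
# appears here; listing the directory shows whether the network plugin
# has written one yet.
try:
    entries = os.listdir(CNI_DIR)
except FileNotFoundError:
    entries = None

print("directory missing" if entries is None else (entries or "directory empty"))
```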
\"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.294841 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b7d20a0d256125ee4b36eca25ab5c73474f437569b71602e2b02560d072a95f2"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.296879 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.297520 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b09af6e88495b030023fa7cedcf9f338a127459216c15f16b4f1b0dafd19d6af"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.299363 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.300142 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.302453 4892 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1" exitCode=255 Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.302536 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.302597 4892 scope.go:117] "RemoveContainer" containerID="29a9efdb9482e02ce084c5d662ca8b4cb7d378257b890ac0cc069689a5a46ff4" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.305148 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.305190 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.305212 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b6da3daf7f0dc3a9f4849225fc1f35977aab95d1b563bf10455cf2ef57d6b855"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.307464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-scxvc" 
event={"ID":"452fb3c0-569f-4c83-ba2a-7e3bafcd509d","Type":"ContainerStarted","Data":"cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.307507 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-scxvc" event={"ID":"452fb3c0-569f-4c83-ba2a-7e3bafcd509d","Type":"ContainerStarted","Data":"e53825aff5a265b2d959b3e22f29ab9f4ab594ee1b88b068f7bc87150ac263b5"} Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.314252 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmlmn\" (UniqueName: \"kubernetes.io/projected/1b933165-d6e5-4add-ada2-6c87697e668b-kube-api-access-xmlmn\") pod \"node-ca-djjtr\" (UID: \"1b933165-d6e5-4add-ada2-6c87697e668b\") " pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.322213 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.342383 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
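From the record at 12:08:55.322213 onward the webhook failures change character: earlier attempts died with "connect: connection refused", while now the listener at 127.0.0.1:9743 answers and TLS verification fails because the serving certificate expired on 2025-08-24T17:21:41Z. A minimal sketch for confirming that from the node, assuming the third-party `cryptography` package is installed and the endpoint is accepting connections (host, port, and expiry date all come from the log record above):

```python
import datetime
import ssl

from cryptography import x509  # third-party package; assumed installed

HOST, PORT = "127.0.0.1", 9743  # endpoint from the failed Post in the log

# Fetch the serving certificate WITHOUT verifying it (verification is exactly
# what fails in the log), then inspect its validity window.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.datetime.utcnow()          # not_valid_after is naive UTC
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)
if now > cert.not_valid_after:
    # Matches the kubelet error: "current time ... is after 2025-08-24T17:21:41Z"
    print("serving certificate has expired")
```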
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.348895 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-djjtr" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.364043 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.384220 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.402177 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.402360 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4t26s"] Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.402813 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.404807 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.404873 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.404983 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.405012 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-5zfsp"] Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.405082 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.405304 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.405634 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.408776 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.408833 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.408855 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.408834 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.408971 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.422780 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.434089 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.434206 4892 scope.go:117] "RemoveContainer" containerID="8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1" Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.434381 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.444619 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
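Each "Failed to update status for pod" record above embeds the attempted JSON patch as a doubly escaped string: klog quotes the patch once, and the journal rendering of the message quotes it again, producing the `\\\"` sequences. A small sketch that recovers a readable patch from a fragment pasted out of one of these lines; the fragment below is copied from the node-ca-djjtr record with the long status body elided so it stays parseable, and is only illustrative:

```python
import codecs
import json

# Fragment from a status_manager line above (status body elided for brevity);
# any of the \\\"-escaped patch payloads in this log decodes the same way.
raw = r'{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"}}'

# Two rounds of unescaping: one undoes the journal/klog quoting of the whole
# message, the second undoes the quoting of the JSON string embedded in it.
once = codecs.decode(raw, "unicode_escape")
twice = codecs.decode(once, "unicode_escape")

patch = json.loads(twice)
print(json.dumps(patch, indent=2))
```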
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.458464 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.468974 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.477906 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487500 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df1cea25-4170-457d-b579-2678161d7d53-cni-binary-copy\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487535 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-netns\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487552 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-mcd-auth-proxy-config\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487573 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-socket-dir-parent\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-etc-kubernetes\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487601 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-os-release\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487634 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-k8s-cni-cncf-io\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487649 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7x7k\" (UniqueName: \"kubernetes.io/projected/df1cea25-4170-457d-b579-2678161d7d53-kube-api-access-h7x7k\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487682 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-kubelet\") pod 
\"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487696 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-hostroot\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487710 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-conf-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487800 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-cni-bin\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487851 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-system-cni-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487904 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/df1cea25-4170-457d-b579-2678161d7d53-multus-daemon-config\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-cnibin\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-multus-certs\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487953 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-proxy-tls\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487977 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-cni-multus\") pod \"multus-5zfsp\" (UID: 
\"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.487993 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-cni-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.488009 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-rootfs\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.488027 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjh7n\" (UniqueName: \"kubernetes.io/projected/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-kube-api-access-xjh7n\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.494204 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b900
92272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.506233 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a9efdb9482e02ce084c5d662ca8b4cb7d378257b890ac0cc069689a5a46ff4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:48Z\\\",\\\"message\\\":\\\"W1006 12:08:37.571125 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 12:08:37.571550 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759752517 cert, and key in /tmp/serving-cert-64910678/serving-signer.crt, /tmp/serving-cert-64910678/serving-signer.key\\\\nI1006 12:08:37.894951 1 observer_polling.go:159] Starting file observer\\\\nW1006 12:08:37.897168 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 12:08:37.897283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:37.899285 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-64910678/tls.crt::/tmp/serving-cert-64910678/tls.key\\\\\\\"\\\\nF1006 12:08:48.158042 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc 
kubenswrapper[4892]: I1006 12:08:55.518214 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.531713 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.558612 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.577885 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589115 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-cni-bin\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589175 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-system-cni-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589192 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/df1cea25-4170-457d-b579-2678161d7d53-multus-daemon-config\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589206 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-cnibin\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589220 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-multus-certs\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589234 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-proxy-tls\") pod 
\"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589229 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-cni-bin\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589294 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-cni-multus\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589254 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-cni-multus\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589354 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-cnibin\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589375 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-cni-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589384 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-multus-certs\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589410 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-rootfs\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589437 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjh7n\" (UniqueName: \"kubernetes.io/projected/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-kube-api-access-xjh7n\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589467 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df1cea25-4170-457d-b579-2678161d7d53-cni-binary-copy\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc 
kubenswrapper[4892]: I1006 12:08:55.589459 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-system-cni-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589491 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-netns\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589516 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-mcd-auth-proxy-config\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589542 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-socket-dir-parent\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-etc-kubernetes\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589593 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-os-release\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589619 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-k8s-cni-cncf-io\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589640 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7x7k\" (UniqueName: \"kubernetes.io/projected/df1cea25-4170-457d-b579-2678161d7d53-kube-api-access-h7x7k\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589701 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-kubelet\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589724 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-hostroot\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589748 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-conf-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589810 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-conf-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.589897 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-rootfs\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590056 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-cni-dir\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590093 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-os-release\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590100 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/df1cea25-4170-457d-b579-2678161d7d53-multus-daemon-config\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590142 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-multus-socket-dir-parent\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590173 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-etc-kubernetes\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590216 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-k8s-cni-cncf-io\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590223 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-run-netns\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590252 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-host-var-lib-kubelet\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/df1cea25-4170-457d-b579-2678161d7d53-hostroot\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590537 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/df1cea25-4170-457d-b579-2678161d7d53-cni-binary-copy\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.590730 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-mcd-auth-proxy-config\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.591464 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.595403 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-proxy-tls\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.603416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjh7n\" (UniqueName: \"kubernetes.io/projected/f0107ee8-a9e2-4a14-b044-1c37a9df4d38-kube-api-access-xjh7n\") pod \"machine-config-daemon-4t26s\" (UID: \"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\") " pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.604732 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.611632 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7x7k\" (UniqueName: \"kubernetes.io/projected/df1cea25-4170-457d-b579-2678161d7d53-kube-api-access-h7x7k\") pod \"multus-5zfsp\" (UID: \"df1cea25-4170-457d-b579-2678161d7d53\") " pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.619201 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.629112 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.642788 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.658281 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.673161 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.699474 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.699718 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:08:57.699690633 +0000 UTC m=+24.249396398 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.722864 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.728810 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-5zfsp" Oct 06 12:08:55 crc kubenswrapper[4892]: W1006 12:08:55.732362 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0107ee8_a9e2_4a14_b044_1c37a9df4d38.slice/crio-64821f4a5f56d2e7b0a04758fc88c475ad8fd08cab476e999d54c295d1013ed4 WatchSource:0}: Error finding container 64821f4a5f56d2e7b0a04758fc88c475ad8fd08cab476e999d54c295d1013ed4: Status 404 returned error can't find the container with id 64821f4a5f56d2e7b0a04758fc88c475ad8fd08cab476e999d54c295d1013ed4 Oct 06 12:08:55 crc kubenswrapper[4892]: W1006 12:08:55.740397 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf1cea25_4170_457d_b579_2678161d7d53.slice/crio-bdc0f01108f176084e612ba0f007247ee01b9015aa4f10faf5299bb61a2685e9 WatchSource:0}: Error finding container bdc0f01108f176084e612ba0f007247ee01b9015aa4f10faf5299bb61a2685e9: Status 404 returned error can't find the container with id bdc0f01108f176084e612ba0f007247ee01b9015aa4f10faf5299bb61a2685e9 Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.793220 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xnzdd"] Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.793822 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.795387 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.795670 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.795679 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cxmhh"] Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.796547 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.798452 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.798509 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.798917 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.798991 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.800081 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.800094 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.800269 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.800286 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.800303 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.800335 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.800354 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800407 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800407 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800451 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:57.800439282 +0000 UTC m=+24.350145047 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800451 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800479 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800490 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800503 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800463 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:57.800458632 +0000 UTC m=+24.350164397 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800523 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800548 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800566 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-06 12:08:57.800538165 +0000 UTC m=+24.350243930 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:55 crc kubenswrapper[4892]: E1006 12:08:55.800601 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:08:57.800584346 +0000 UTC m=+24.350290201 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.811626 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.825107 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.844092 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfb
b085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.856028 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.869269 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.883882 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.897498 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901387 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b947557-ac75-461e-8603-9c3ce29ad5ab-cni-binary-copy\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901415 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-config\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901434 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901450 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f49rw\" (UniqueName: \"kubernetes.io/projected/9b947557-ac75-461e-8603-9c3ce29ad5ab-kube-api-access-f49rw\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901467 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901541 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-ovn\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901563 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-env-overrides\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901595 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-var-lib-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901612 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovn-node-metrics-cert\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901627 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-script-lib\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901644 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-os-release\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901659 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-kubelet\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901728 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-netns\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901768 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-systemd-units\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901829 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swtk8\" (UniqueName: \"kubernetes.io/projected/e115ba33-9ba0-42d6-82a0-09ef8c996788-kube-api-access-swtk8\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901855 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-netd\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901891 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-etc-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901928 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-slash\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901943 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-log-socket\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901958 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-ovn-kubernetes\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.901998 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-system-cni-dir\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.902023 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-systemd\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.902050 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-node-log\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.902068 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-cnibin\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.902084 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9b947557-ac75-461e-8603-9c3ce29ad5ab-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.902101 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-bin\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.907180 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.917120 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.926951 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.950205 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a
9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.961880 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a9efdb9482e02ce084c5d662ca8b4cb7d378257b890ac0cc069689a5a46ff4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:48Z\\\",\\\"message\\\":\\\"W1006 12:08:37.571125 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
12:08:37.571550 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759752517 cert, and key in /tmp/serving-cert-64910678/serving-signer.crt, /tmp/serving-cert-64910678/serving-signer.key\\\\nI1006 12:08:37.894951 1 observer_polling.go:159] Starting file observer\\\\nW1006 12:08:37.897168 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 12:08:37.897283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:37.899285 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-64910678/tls.crt::/tmp/serving-cert-64910678/tls.key\\\\\\\"\\\\nF1006 12:08:48.158042 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:55 crc kubenswrapper[4892]: I1006 12:08:55.977067 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002729 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-var-lib-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002770 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovn-node-metrics-cert\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002801 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-script-lib\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002823 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-os-release\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002844 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-kubelet\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002841 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-var-lib-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002909 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-netns\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002864 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-netns\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002949 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-kubelet\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002958 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.002991 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-systemd-units\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003013 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-netd\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003015 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-os-release\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003031 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swtk8\" (UniqueName: \"kubernetes.io/projected/e115ba33-9ba0-42d6-82a0-09ef8c996788-kube-api-access-swtk8\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003106 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-etc-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-netd\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003152 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-log-socket\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003110 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-systemd-units\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003178 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-ovn-kubernetes\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003203 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-log-socket\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003205 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-system-cni-dir\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003199 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003253 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-ovn-kubernetes\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003230 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-system-cni-dir\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003204 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-etc-openvswitch\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003238 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-slash\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003258 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-slash\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003356 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-systemd\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-node-log\") pod 
\"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003410 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9b947557-ac75-461e-8603-9c3ce29ad5ab-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003417 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-systemd\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003433 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-bin\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003470 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-node-log\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003478 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-bin\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003498 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-cnibin\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003525 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-config\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003549 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-cnibin\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003563 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b947557-ac75-461e-8603-9c3ce29ad5ab-cni-binary-copy\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " 
pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003944 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f49rw\" (UniqueName: \"kubernetes.io/projected/9b947557-ac75-461e-8603-9c3ce29ad5ab-kube-api-access-f49rw\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003965 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.003989 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-script-lib\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004016 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-env-overrides\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004037 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-ovn\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004087 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9b947557-ac75-461e-8603-9c3ce29ad5ab-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004114 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-ovn\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004274 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004470 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9b947557-ac75-461e-8603-9c3ce29ad5ab-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004599 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-config\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004830 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9b947557-ac75-461e-8603-9c3ce29ad5ab-cni-binary-copy\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.004839 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-env-overrides\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.006153 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovn-node-metrics-cert\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.006893 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.027181 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swtk8\" (UniqueName: \"kubernetes.io/projected/e115ba33-9ba0-42d6-82a0-09ef8c996788-kube-api-access-swtk8\") pod \"ovnkube-node-cxmhh\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.048796 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f49rw\" (UniqueName: \"kubernetes.io/projected/9b947557-ac75-461e-8603-9c3ce29ad5ab-kube-api-access-f49rw\") pod \"multus-additional-cni-plugins-xnzdd\" (UID: \"9b947557-ac75-461e-8603-9c3ce29ad5ab\") " pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 
12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.086685 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"start
Time\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.107063 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.113476 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:08:56 crc kubenswrapper[4892]: W1006 12:08:56.117847 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b947557_ac75_461e_8603_9c3ce29ad5ab.slice/crio-6f8e26e97f2bc582101eccd154b6cf7ad97b1da68dc6eb7a34c203f60f8f836b WatchSource:0}: Error finding container 6f8e26e97f2bc582101eccd154b6cf7ad97b1da68dc6eb7a34c203f60f8f836b: Status 404 returned error can't find the container with id 6f8e26e97f2bc582101eccd154b6cf7ad97b1da68dc6eb7a34c203f60f8f836b Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.135471 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.168537 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:08:56 crc kubenswrapper[4892]: E1006 12:08:56.168644 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.169132 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:08:56 crc kubenswrapper[4892]: E1006 12:08:56.169211 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.172913 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.173832 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.173871 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.175577 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.176376 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.177602 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.178199 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.178959 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.180126 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.181046 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.182184 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 06 12:08:56 crc 
kubenswrapper[4892]: I1006 12:08:56.183221 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.183772 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.184817 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.185576 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.186764 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.201956 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29a9efdb9482e02ce084c5d662ca8b4cb7d378257b890ac0cc069689a5a46ff4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:48Z\\\",\\\"message\\\":\\\"W1006 12:08:37.571125 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1006 
12:08:37.571550 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759752517 cert, and key in /tmp/serving-cert-64910678/serving-signer.crt, /tmp/serving-cert-64910678/serving-signer.key\\\\nI1006 12:08:37.894951 1 observer_polling.go:159] Starting file observer\\\\nW1006 12:08:37.897168 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1006 12:08:37.897283 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:37.899285 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-64910678/tls.crt::/tmp/serving-cert-64910678/tls.key\\\\\\\"\\\\nF1006 12:08:48.158042 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.242604 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.288816 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.311001 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerStarted","Data":"6f8e26e97f2bc582101eccd154b6cf7ad97b1da68dc6eb7a34c203f60f8f836b"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.313848 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b" exitCode=0 Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.313910 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.313935 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"78d656be122ec40f7a14867c983b72f6b24e17f230349e4654869a98b3b017a7"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.317597 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-djjtr" 
event={"ID":"1b933165-d6e5-4add-ada2-6c87697e668b","Type":"ContainerStarted","Data":"abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.317646 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-djjtr" event={"ID":"1b933165-d6e5-4add-ada2-6c87697e668b","Type":"ContainerStarted","Data":"2e0b940312fb32e4441691eeb7c3091f56f36c2c52bdc48dd1692cf9b5145689"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.319292 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.321692 4892 scope.go:117] "RemoveContainer" containerID="8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1" Oct 06 12:08:56 crc kubenswrapper[4892]: E1006 12:08:56.321875 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.322872 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerStarted","Data":"7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.322901 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerStarted","Data":"bdc0f01108f176084e612ba0f007247ee01b9015aa4f10faf5299bb61a2685e9"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.324674 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.324698 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.324711 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"64821f4a5f56d2e7b0a04758fc88c475ad8fd08cab476e999d54c295d1013ed4"} Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.327689 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.364576 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.401787 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.444440 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.483189 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.532293 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.563909 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.601688 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.643767 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.683020 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.722283 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.761785 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.805806 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.845567 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.895474 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.926746 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:56 crc kubenswrapper[4892]: I1006 12:08:56.966646 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:56Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.004000 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.050841 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.083111 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.124804 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.159948 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.168237 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.168444 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.203619 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.243545 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.330929 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.332595 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b947557-ac75-461e-8603-9c3ce29ad5ab" containerID="4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9" exitCode=0 Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.332690 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerDied","Data":"4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.337271 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.337315 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" 
event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.337363 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.337381 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.337399 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.337417 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.345577 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.364545 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.380608 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.408469 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.441207 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.486646 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.523105 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.567521 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.604210 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.643071 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.689134 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.720511 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.720686 4892 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:09:01.720659606 +0000 UTC m=+28.270365411 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.724610 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\"
:\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.766728 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z 
is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.811222 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.821975 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.822021 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.822046 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.822093 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822132 4892 
secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822178 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822203 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822206 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822216 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822226 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822219 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822239 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822208 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:01.822190118 +0000 UTC m=+28.371895883 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822340 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:01.822301811 +0000 UTC m=+28.372007566 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822363 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:01.822356252 +0000 UTC m=+28.372062017 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:08:57 crc kubenswrapper[4892]: E1006 12:08:57.822382 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:01.822374963 +0000 UTC m=+28.372080728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.844634 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.882569 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.930277 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:57 crc kubenswrapper[4892]: I1006 12:08:57.967268 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:57Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.004138 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.050627 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.084924 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.127692 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.168584 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.168684 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:08:58 crc kubenswrapper[4892]: E1006 12:08:58.168847 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:08:58 crc kubenswrapper[4892]: E1006 12:08:58.169108 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.171367 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.209607 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.246945 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.288878 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.322059 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.342174 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b947557-ac75-461e-8603-9c3ce29ad5ab" containerID="f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30" exitCode=0 Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.342232 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" 
event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerDied","Data":"f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30"} Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.365528 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\
\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.413285 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z 
is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.450126 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.485542 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.522363 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a791
3e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.568210 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.602880 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.643632 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.681705 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.727509 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.774061 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.808228 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.844614 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.888440 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.924057 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:58 crc kubenswrapper[4892]: I1006 12:08:58.965351 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:58Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.001818 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.042973 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.168363 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:08:59 crc kubenswrapper[4892]: E1006 12:08:59.168480 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.349364 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b947557-ac75-461e-8603-9c3ce29ad5ab" containerID="208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3" exitCode=0 Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.349462 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerDied","Data":"208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3"} Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.357647 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.377496 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.399669 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.420165 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.436851 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.457626 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\
":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary
-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.478135 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.496447 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.514596 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.527696 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.542022 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.565063 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.581964 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.604443 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.620476 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:08:59 crc kubenswrapper[4892]: I1006 12:08:59.651666 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:08:59Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.167968 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.168042 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.168220 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.168760 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.362622 4892 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.365131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.365205 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.365233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.365459 4892 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.367987 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b947557-ac75-461e-8603-9c3ce29ad5ab" containerID="fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606" exitCode=0 Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.368053 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerDied","Data":"fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.377886 4892 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.378288 4892 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.379932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.379989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.380007 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.380032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.380051 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.392690 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.402567 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2
d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.408627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.408670 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.408682 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.408701 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.408712 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.411034 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.424838 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.432661 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.437429 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.437496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.437521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.437549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.437570 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.444510 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.460954 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.462447 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.466798 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.466846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.466862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.466885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.466904 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.484725 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.488768 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.491006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.491076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.491094 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.491116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.491133 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.505031 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: E1006 12:09:00.505268 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.507107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.507161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.507177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.507197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.507211 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.515412 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.527180 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.540682 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.571304 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.592918 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.603674 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.609091 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.609122 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.609130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.609143 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.609151 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.620832 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.635015 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.647810 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:00Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.712252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.712367 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.712387 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.712417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.712436 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.815419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.815490 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.815507 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.815535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.815555 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.919601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.919668 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.919692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.919723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:00 crc kubenswrapper[4892]: I1006 12:09:00.919744 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:00Z","lastTransitionTime":"2025-10-06T12:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.023622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.023686 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.023705 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.023728 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.023746 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.127938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.128013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.128034 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.128060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.128073 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.168089 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.168548 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.231472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.231511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.231519 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.231532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.231543 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.333541 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.333577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.333590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.333608 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.333620 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.375160 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b947557-ac75-461e-8603-9c3ce29ad5ab" containerID="9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a" exitCode=0 Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.375239 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerDied","Data":"9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.389246 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.389641 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.389663 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.389674 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.389956 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.405208 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.415395 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.415488 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.420299 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.437530 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.438000 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.438039 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.438050 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.438067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.438078 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.450036 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.465646 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.477815 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.486959 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.499436 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.513216 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a791
3e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.531765 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18
e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.540895 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.540932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc 
kubenswrapper[4892]: I1006 12:09:01.540943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.540960 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.540972 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.545374 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.557024 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.568903 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.586063 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.598507 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.610553 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.621724 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.631249 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.640945 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.643918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.643955 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.643969 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.643989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.644003 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.651436 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.668522 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.695485 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.718408 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.735393 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.745999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.746038 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.746049 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.746063 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.746073 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.748498 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.759294 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.761596 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.761759 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:09:09.761739131 +0000 UTC m=+36.311444916 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.776501 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077cee
e52726887645b5a24bd14daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.795474 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.808740 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:01Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.848416 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.848462 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.848476 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.848491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.848502 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.862977 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.863165 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863112 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863264 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:09.863240052 +0000 UTC m=+36.412945827 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863429 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863464 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863477 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863530 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:09.86351288 +0000 UTC m=+36.413218655 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.863650 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.863690 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863774 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863788 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863804 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863812 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863825 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:09.863814798 +0000 UTC m=+36.413520573 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:09:01 crc kubenswrapper[4892]: E1006 12:09:01.863844 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:09.863833669 +0000 UTC m=+36.413539434 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.951554 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.951595 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.951606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.951622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:01 crc kubenswrapper[4892]: I1006 12:09:01.951631 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:01Z","lastTransitionTime":"2025-10-06T12:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.054407 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.054462 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.054476 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.054493 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.054506 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.158134 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.158196 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.158214 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.158237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.158254 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.168549 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.168638 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:02 crc kubenswrapper[4892]: E1006 12:09:02.169220 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:02 crc kubenswrapper[4892]: E1006 12:09:02.169379 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.261126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.261536 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.261699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.261917 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.262115 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.365844 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.366190 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.366357 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.366538 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.366713 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.399098 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b947557-ac75-461e-8603-9c3ce29ad5ab" containerID="b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb" exitCode=0 Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.399233 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerDied","Data":"b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.426728 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.456071 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.469976 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.470019 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.470033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.470057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.470071 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.478583 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.495919 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.508119 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.522280 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\
\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.540950 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.563291 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.572558 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.572584 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.572592 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.572603 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.572612 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.579382 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.589198 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.605157 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.617230 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.636867 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca363
5b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.660649 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b8776
1b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43
f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.674726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.674802 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.674841 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.674871 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.674896 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.680367 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.761735 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.762692 4892 scope.go:117] "RemoveContainer" containerID="8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1" Oct 06 12:09:02 crc kubenswrapper[4892]: E1006 12:09:02.762940 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.777303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
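[Editor's note: the "back-off 10s" above is kubelet's CrashLoopBackOff handling: each failed restart of kube-apiserver-check-endpoints roughly doubles the wait before the next attempt, starting at 10s and capped at 5m per upstream kubelet defaults. A minimal Go sketch of that doubling schedule, assuming those defaults; crashLoopDelay is an illustrative helper, not kubelet's actual code.

package main

import (
	"fmt"
	"time"
)

// crashLoopDelay models the doubling back-off seen in the log
// ("back-off 10s restarting failed container..."): start at base,
// double per prior failed restart, cap at max.
func crashLoopDelay(restarts int, base, max time.Duration) time.Duration {
	d := base
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= max {
			return max
		}
	}
	return d
}

func main() {
	for r := 0; r <= 5; r++ {
		fmt.Printf("restart %d -> back-off %v\n", r, crashLoopDelay(r, 10*time.Second, 5*time.Minute))
	}
	// restart 0 -> 10s, restart 1 -> 20s, ..., restart 5 -> 5m0s (capped)
}

The log resumes below.]
Oct 06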
12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.777412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.777438 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.777468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.777490 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.880370 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.880463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.880488 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.880516 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.880536 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.983644 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.983705 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.983723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.983744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:02 crc kubenswrapper[4892]: I1006 12:09:02.983761 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:02Z","lastTransitionTime":"2025-10-06T12:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.086862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.086921 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.086937 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.086959 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.086975 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.168354 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:03 crc kubenswrapper[4892]: E1006 12:09:03.168535 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.189845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.189906 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.189923 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.189975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.189993 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.292470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.292535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.292552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.292620 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.292638 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.395855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.395917 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.395939 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.395968 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.395994 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.408987 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" event={"ID":"9b947557-ac75-461e-8603-9c3ce29ad5ab","Type":"ContainerStarted","Data":"333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.431248 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.453725 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.472135 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.488660 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.499047 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.499117 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.499141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.499173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.499193 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.506877 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.542255 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c
18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\
\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.584183 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.601385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.601412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.601422 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.601437 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.601448 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.617210 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.633833 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.655280 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.668496 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.680696 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.691647 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.701502 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.703252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.703313 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.703353 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.703376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.703390 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.715194 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:
08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:03Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.805231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.805283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.805293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.805307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.805332 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.907252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.907283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.907293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.907308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:03 crc kubenswrapper[4892]: I1006 12:09:03.907317 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:03Z","lastTransitionTime":"2025-10-06T12:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.010840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.010915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.010938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.010967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.010989 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.114512 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.114571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.114611 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.114642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.114661 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.167696 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.167859 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:04 crc kubenswrapper[4892]: E1006 12:09:04.167891 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:04 crc kubenswrapper[4892]: E1006 12:09:04.168078 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.183788 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.199403 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.213818 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.218604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.218665 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.218684 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.218710 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.218734 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.228960 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.244122 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.277807 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.322572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.322637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.322660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc 
kubenswrapper[4892]: I1006 12:09:04.322688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.322706 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.324583 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d99
7f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.347355 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: E1006 12:09:04.356466 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode115ba33_9ba0_42d6_82a0_09ef8c996788.slice/crio-27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.366226 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.384884 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.399642 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.414536 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/0.log" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.414989 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.418786 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa" exitCode=1 Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.418868 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.420488 4892 scope.go:117] "RemoveContainer" containerID="27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.424908 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.424989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.425016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.425052 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.425076 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.434301 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.449595 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.468763 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.498896 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
6T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa
7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.515154 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.528499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.528570 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.528591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.528620 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.528640 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.532961 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.554282 4892 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.578095 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/
tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:04Z\\\",\\\"message\\\":\\\"r removal\\\\nI1006 12:09:04.305154 6150 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 12:09:04.305149 6150 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:04.305240 6150 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:04.305215 6150 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:04.305228 6150 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:04.305274 6150 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:04.305312 6150 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:04.305362 6150 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:04.305317 6150 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:04.306828 6150 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 12:09:04.306858 6150 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 12:09:04.306889 6150 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 12:09:04.306943 6150 factory.go:656] Stopping watch factory\\\\nI1006 12:09:04.306967 6150 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:04.307003 6150 handler.go:208] Removed 
*v1.Node event handler 7\\\\nI1006 12:09:04.307023 6150 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481
bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.594650 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.612876 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.626044 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.635196 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.635256 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.635349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.635374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.635392 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.649651 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:
08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.666385 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.681720 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.694705 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.708670 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.719246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.733772 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a791
3e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:04Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.737299 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.737382 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.737401 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.737425 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.737442 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.840789 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.840852 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.840872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.840898 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.840923 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.944090 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.944150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.944171 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.944197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:04 crc kubenswrapper[4892]: I1006 12:09:04.944217 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:04Z","lastTransitionTime":"2025-10-06T12:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.047280 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.047396 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.047421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.047451 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.047473 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.150399 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.150438 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.150446 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.150463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.150473 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.168070 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:05 crc kubenswrapper[4892]: E1006 12:09:05.168274 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.252430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.252495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.252510 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.252532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.252548 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.354244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.354282 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.354293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.354309 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.354336 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.424128 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/0.log" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.427106 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.427455 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.440758 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.456868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.456940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.456962 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.456991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.457012 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.468275 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e7
3f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:04Z\\\",\\\"message\\\":\\\"r removal\\\\nI1006 12:09:04.305154 6150 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 12:09:04.305149 6150 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:04.305240 6150 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:04.305215 6150 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:04.305228 6150 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:04.305274 6150 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:04.305312 6150 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:04.305362 6150 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:04.305317 6150 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:04.306828 6150 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 12:09:04.306858 6150 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 12:09:04.306889 6150 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 12:09:04.306943 6150 factory.go:656] Stopping watch factory\\\\nI1006 12:09:04.306967 6150 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:04.307003 6150 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 12:09:04.307023 6150 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"cont
ainerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.487359 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.501153 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.511931 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.521134 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.531786 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.544247 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.554247 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.559030 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.559092 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.559104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.559121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.559132 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.568984 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:
08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.582497 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.592434 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.601039 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.609604 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.619218 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:05Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.661172 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.661217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.661231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.661249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.661264 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.765124 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.765199 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.765215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.765240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.765257 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.868073 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.868104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.868112 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.868127 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.868136 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.969855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.969921 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.969938 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.969966 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:05 crc kubenswrapper[4892]: I1006 12:09:05.969985 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:05Z","lastTransitionTime":"2025-10-06T12:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.073045 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.073144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.073172 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.073206 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.073248 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.168360 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.168557 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:06 crc kubenswrapper[4892]: E1006 12:09:06.168560 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:06 crc kubenswrapper[4892]: E1006 12:09:06.168797 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.176789 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.176847 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.176872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.176899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.176920 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.280124 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.280185 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.280202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.280230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.280248 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.383080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.383144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.383160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.383186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.383203 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.433366 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/1.log" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.434142 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/0.log" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.438628 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118" exitCode=1 Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.438711 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.439127 4892 scope.go:117] "RemoveContainer" containerID="27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.440108 4892 scope.go:117] "RemoveContainer" containerID="d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118" Oct 06 12:09:06 crc kubenswrapper[4892]: E1006 12:09:06.440478 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.461692 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.479162 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.485478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.485525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.485541 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.485563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.485582 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.492420 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.503213 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.523651 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.543656 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.561286 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.574564 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.584149 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.587872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.587987 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.588047 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.588125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.588198 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.594930 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.608002 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.626026 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e7
3f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:04Z\\\",\\\"message\\\":\\\"r removal\\\\nI1006 12:09:04.305154 6150 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 12:09:04.305149 6150 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:04.305240 6150 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:04.305215 6150 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:04.305228 6150 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:04.305274 6150 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:04.305312 6150 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:04.305362 6150 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:04.305317 6150 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:04.306828 6150 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 12:09:04.306858 6150 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 12:09:04.306889 6150 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 12:09:04.306943 6150 factory.go:656] Stopping watch factory\\\\nI1006 12:09:04.306967 6150 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:04.307003 6150 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 12:09:04.307023 6150 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 
ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.647508 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
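
The err= bodies quoted throughout these entries are Kubernetes strategic-merge patches rather than plain JSON merge patches: the "$setElementOrder/conditions" directive pins the relative order of the conditions list, and list entries merge by their "type" key instead of by array position. A stripped-down sketch of how such a patch applies, using the apimachinery helper; the condition values here are invented for illustration, not taken from the pod statuses above:

```go
// Minimal strategic-merge-patch demo (illustrative values). Requires
// k8s.io/api and k8s.io/apimachinery in go.mod.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/strategicpatch"
)

func main() {
	original := []byte(`{"status":{"conditions":[
		{"type":"Ready","status":"False"},
		{"type":"Initialized","status":"True"}]}}`)

	// Same shape as the patches in the log: $setElementOrder fixes the
	// list order, and the lone Ready entry merges into the existing one
	// by its "type" key.
	patch := []byte(`{"status":{
		"$setElementOrder/conditions":[{"type":"Initialized"},{"type":"Ready"}],
		"conditions":[{"type":"Ready","status":"True"}]}}`)

	// corev1.Pod carries the patchMergeKey/patchStrategy struct tags that
	// tell the merger how to combine the conditions list.
	merged, err := strategicpatch.StrategicMergePatch(original, patch, corev1.Pod{})
	if err != nil {
		panic(err)
	}
	// Prints the merged status with conditions reordered to
	// [Initialized, Ready] and Ready flipped to True.
	fmt.Println(string(merged))
}
```
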
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.662948 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.681304 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
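
The CrashLoopBackOff waiting reason on kube-apiserver-check-endpoints follows kubelet's standard restart backoff: the first restart is delayed 10s, and the delay doubles on each subsequent crash up to a 5-minute cap. That is default kubelet behavior; the sketch below just illustrates the resulting schedule and is not kubelet source:

```go
// Sketch of kubelet's default crash-loop restart schedule: 10s initial
// delay, doubling per restart, capped at 5 minutes.
package main

import (
	"fmt"
	"time"
)

func main() {
	backoff := 10 * time.Second
	const maxBackoff = 5 * time.Minute
	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("restart %d: wait %s\n", restart, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}
```
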
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:06Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.691210 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.691388 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.691412 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.691476 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.691498 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.793982 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.794023 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.794062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.794080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.794093 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.897533 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.897604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.897628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.897658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:06 crc kubenswrapper[4892]: I1006 12:09:06.897679 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:06Z","lastTransitionTime":"2025-10-06T12:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.000892 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.000980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.000996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.001019 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.001038 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.104412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.104453 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.104464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.104480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.104495 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.167748 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:07 crc kubenswrapper[4892]: E1006 12:09:07.167968 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
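
The NodeNotReady loop repeating through this stretch of the log is the flip side of the same outage: kubelet holds the node's Ready condition at False while the container runtime reports NetworkReady=false, and the runtime reports that until a network config file shows up in /etc/kubernetes/cni/net.d/, which the network plugin (here multus delegating to ovn-kubernetes, whose ovnkube-controller is crash-looping above) has not yet written. The readiness check effectively amounts to scanning that directory for a usable config; a sketch assuming the usual libcni extensions (.conf, .conflist, .json), which is a convention and not quoted kubelet code:

```go
// Rough equivalent of the NetworkReady condition the log keeps reporting:
// the node network counts as "ready" once any CNI config file exists in
// the directory named in the error message.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("NetworkReady=false: %v\n", err)
		return
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Printf("NetworkReady=true: found %s\n", e.Name())
			return
		}
	}
	fmt.Printf("NetworkReady=false: no CNI configuration file in %s\n", dir)
}
```
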
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.207592 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.207622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.207631 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.207643 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.207651 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.310507 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.310582 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.310607 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.310634 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.310650 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.413105 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.413145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.413153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.413167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.413175 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.442174 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8"] Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.443009 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.445713 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/1.log" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.446071 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.447878 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.452129 4892 scope.go:117] "RemoveContainer" containerID="d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118" Oct 06 12:09:07 crc kubenswrapper[4892]: E1006 12:09:07.452443 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.473635 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.495054 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.513756 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.516308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.516384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.516396 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.516416 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.516428 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.520881 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.521003 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.521174 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcnd\" (UniqueName: \"kubernetes.io/projected/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-kube-api-access-qkcnd\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.521250 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.534031 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.566386 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca363
5b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27711f32e30e777763687ce603ea62a83e077ceee52726887645b5a24bd14daa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:04Z\\\",\\\"message\\\":\\\"r removal\\\\nI1006 12:09:04.305154 6150 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1006 12:09:04.305149 6150 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:04.305240 6150 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:04.305215 6150 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:04.305228 6150 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:04.305274 6150 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:04.305312 6150 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:04.305362 6150 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:04.305317 6150 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:04.306828 6150 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1006 12:09:04.306858 6150 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1006 12:09:04.306889 6150 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1006 12:09:04.306943 6150 factory.go:656] Stopping watch factory\\\\nI1006 12:09:04.306967 6150 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:04.307003 6150 handler.go:208] Removed *v1.Node event handler 7\\\\nI1006 12:09:04.307023 6150 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.588378 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.607658 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.620044 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.620170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.620189 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.620215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.620231 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.622786 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.623019 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.623179 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcnd\" (UniqueName: \"kubernetes.io/projected/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-kube-api-access-qkcnd\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.623436 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.623955 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.624278 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.630140 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.631605 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.647107 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcnd\" (UniqueName: \"kubernetes.io/projected/69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2-kube-api-access-qkcnd\") pod \"ovnkube-control-plane-749d76644c-r9zj8\" (UID: \"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.652047 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.677396 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.695401 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.717272 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.722835 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.722877 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.722894 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.722918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.722935 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.737556 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.753710 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.756581 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.773904 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: W1006 12:09:07.777787 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b3b9c3_f7a1_42cc_9108_e8afc1f27ad2.slice/crio-232c21e89cf2a28bd75f7bf5d2fa13c72e1e47bf2bbbcb2be1bf4923dbe6034e WatchSource:0}: Error finding container 232c21e89cf2a28bd75f7bf5d2fa13c72e1e47bf2bbbcb2be1bf4923dbe6034e: Status 404 returned error can't find the container with id 232c21e89cf2a28bd75f7bf5d2fa13c72e1e47bf2bbbcb2be1bf4923dbe6034e Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.793636 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.808181 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.824574 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.825633 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.825708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.825730 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.825761 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.825786 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.840057 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.863785 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.886281 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.904476 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.919389 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z"
Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.927967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.928003 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.928016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.928033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
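The "back-off 10s restarting failed container=kube-apiserver-check-endpoints" message above is the kubelet's per-container crash-loop backoff. A minimal sketch of that schedule, assuming the usual kubelet defaults (a 10s initial delay that doubles on each crash, capped at 5m; these values are an assumption, not stated in this log):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Assumed kubelet defaults: 10s initial delay, doubled per crash, 5m cap.
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for restart := 1; restart <= 8; restart++ {
            fmt.Printf("restart %d: back-off %v\n", restart, delay)
            if delay *= 2; delay > maxDelay {
                delay = maxDelay
            }
        }
    }

The first iteration prints the same 10s delay the CrashLoopBackOff message reports.

Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.928043 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:07Z","lastTransitionTime":"2025-10-06T12:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 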
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.937239 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z"
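Every status patch in this stretch fails identically: the kubelet cannot POST to the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 because that webhook's serving certificate expired on 2025-08-24T17:21:41Z, long before the node's current clock of 2025-10-06. A minimal sketch for confirming the certificate window from the node itself, using only the Go standard library (InsecureSkipVerify is needed precisely because ordinary verification fails on the expired cert):

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
    )

    func main() {
        // Endpoint taken from the webhook errors above; skip verification so
        // the expired certificate can still be read rather than rejected.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("notBefore:", cert.NotBefore)
        fmt.Println("notAfter: ", cert.NotAfter) // the log implies 2025-08-24T17:21:41Z
    }

Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.949773 4892 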
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.968531 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/
tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.985565 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:07 crc kubenswrapper[4892]: I1006 12:09:07.995595 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:07Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.008034 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.034955 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.034999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.035013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.035034 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.035051 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.044906 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.081545 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.094678 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.136898 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.136936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.136946 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.136961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.136970 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.168240 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.168266 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:08 crc kubenswrapper[4892]: E1006 12:09:08.168397 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:08 crc kubenswrapper[4892]: E1006 12:09:08.168537 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.238683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.238720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.238730 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.238746 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.238757 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.342249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.342286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.342297 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.342314 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.342350 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.445568 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.445656 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.445701 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.445734 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.445758 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.458613 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" event={"ID":"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2","Type":"ContainerStarted","Data":"75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.458699 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" event={"ID":"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2","Type":"ContainerStarted","Data":"a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.458727 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" event={"ID":"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2","Type":"ContainerStarted","Data":"232c21e89cf2a28bd75f7bf5d2fa13c72e1e47bf2bbbcb2be1bf4923dbe6034e"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.479204 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.496356 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.520044 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\
"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed
81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.540748 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.548104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.548168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.548185 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.548210 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.548228 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.561891 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.578287 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.597303 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.613657 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.636737 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.650807 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.650864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.650876 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.650895 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.650910 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.654011 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220
d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.675892 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.698226 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.719994 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.737854 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.753639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.753740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.753758 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.753780 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.753808 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.756225 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.776367 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e7
3f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.857105 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.857434 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.857557 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.857674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.857767 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.960547 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.960588 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.960597 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.960611 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.960622 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:08Z","lastTransitionTime":"2025-10-06T12:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.977564 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-bf88v"] Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.978399 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:08 crc kubenswrapper[4892]: E1006 12:09:08.978505 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:08 crc kubenswrapper[4892]: I1006 12:09:08.992587 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:08Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.010037 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.026141 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.035632 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfgmm\" (UniqueName: \"kubernetes.io/projected/d042dea2-ba2d-4825-a01c-79d5eb2fc912-kube-api-access-kfgmm\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.035763 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.044593 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.061150 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.063354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.063406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.063454 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.063480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.063496 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.078180 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.095940 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.120434 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.136739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfgmm\" (UniqueName: \"kubernetes.io/projected/d042dea2-ba2d-4825-a01c-79d5eb2fc912-kube-api-access-kfgmm\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.136861 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.137013 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 
12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.137116 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:09.637086791 +0000 UTC m=+36.186792586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.141415 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.162246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.166501 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfgmm\" (UniqueName: \"kubernetes.io/projected/d042dea2-ba2d-4825-a01c-79d5eb2fc912-kube-api-access-kfgmm\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.167573 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.167742 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.167797 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.167850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.167867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.167896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.167917 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.196646 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.231001 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18
e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.253887 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.270729 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.270799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.270816 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.270842 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.270861 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.277366 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"starte
dAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 
2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.296602 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.316472 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.336591 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:09Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.374572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.374636 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.374653 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.375071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.375128 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.478274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.478350 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.478368 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.478389 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.478407 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.582126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.582168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.582182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.582206 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.582218 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.643088 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.643305 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.643464 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:10.643433941 +0000 UTC m=+37.193139746 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.686001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.686062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.686077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.686102 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.686120 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.788835 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.788902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.788934 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.788965 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.788986 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.845501 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.845728 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:09:25.845696592 +0000 UTC m=+52.395402387 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.893071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.893566 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.893585 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.893612 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.893633 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.947235 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.947319 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.947603 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.947707 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947746 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947833 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947855 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947861 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947747 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947963 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947985 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947782 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.947927 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:25.947904732 +0000 UTC m=+52.497610527 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.948160 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:25.948136869 +0000 UTC m=+52.497842664 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.948200 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:25.94818446 +0000 UTC m=+52.497890265 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:09 crc kubenswrapper[4892]: E1006 12:09:09.948232 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:25.948219461 +0000 UTC m=+52.497925266 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.996353 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.996403 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.996482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.996501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:09 crc kubenswrapper[4892]: I1006 12:09:09.996526 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:09Z","lastTransitionTime":"2025-10-06T12:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.100767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.100915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.100996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.101077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.101107 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.168438 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.168536 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.168711 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.168921 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.204228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.204286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.204303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.204360 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.204386 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.308199 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.308260 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.308278 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.308303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.308341 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.411850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.411899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.411911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.411931 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.411944 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.514617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.514746 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.514766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.514795 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.514813 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.618144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.618217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.618231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.618251 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.618266 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.655902 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.656027 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.656084 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:12.65607076 +0000 UTC m=+39.205776515 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.720620 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.720696 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.720727 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.720759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.720782 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.741500 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.741574 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.741597 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.741623 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.741649 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.761569 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:10Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.766767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.766832 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.766855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.766886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.766911 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.788600 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:10Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.794142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.794233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
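Every status-update failure above has the same root cause: the kubelet's PATCH of node "crc" (the same status payload is resent on each attempt) passes through the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-06. A minimal Go sketch for inspecting that certificate's validity window from the node (endpoint taken from the log; the probe itself is illustrative and not part of the kubelet):

    // Probe the webhook endpoint from the log and print the validity window
    // of its serving certificate. InsecureSkipVerify is deliberate here: the
    // handshake must succeed even though the certificate is expired, because
    // we only want to read the certificate, not trust it.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
            InsecureSkipVerify: true,
        })
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Printf("subject:   %s\n", cert.Subject)
        fmt.Printf("notBefore: %s\n", cert.NotBefore)
        fmt.Printf("notAfter:  %s\n", cert.NotAfter)
        if time.Now().After(cert.NotAfter) {
            // Matches the x509 "certificate has expired" error in the log.
            fmt.Println("certificate has expired")
        }
    }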
event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.794262 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.794296 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.794356 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.810857 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:10Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.816008 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.816080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
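The kubelet retries the same status patch a bounded number of times per sync loop before giving up, which is why the identical webhook error repeats within milliseconds and is followed below (at 12:09:10.872421) by "Unable to update node status" err="update node status exceeds retry count". A minimal sketch of that bounded-retry pattern, assuming the conventional nodeStatusUpdateRetry = 5 from kubelet_node_status.go (the helper below is illustrative, not kubelet source):

    // Sketch of the kubelet's bounded retry around node status updates.
    // tryPatchNodeStatus is a stand-in that fails the way the log does.
    package main

    import (
        "errors"
        "fmt"
    )

    const nodeStatusUpdateRetry = 5 // assumed retry bound per sync loop

    func tryPatchNodeStatus() error {
        return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryPatchNodeStatus(); err != nil {
                // Mirrors "Error updating node status, will retry".
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return nil
        }
        // Mirrors "update node status exceeds retry count".
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println(err)
        }
    }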
event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.816100 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.816127 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.816144 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.835221 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:10Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.841022 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.841086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
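Independently of the webhook failure, the node stays NotReady because the container runtime reports NetworkReady=false: no CNI configuration exists in /etc/kubernetes/cni/net.d/, which on OpenShift is normally written once the cluster network provider starts. A hedged sketch that checks the directory for the configuration file extensions CNI's libcni conventionally loads (.conf, .conflist, .json); path taken from the log, the check itself is illustrative:

    // List any CNI network configuration in the directory the runtime
    // complains about; an empty result corresponds to NetworkReady=false.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        confDir := "/etc/kubernetes/cni/net.d"
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Printf("cannot read %s: %v\n", confDir, err)
            return
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config:", filepath.Join(confDir, e.Name()))
                found = true
            }
        }
        if !found {
            // The state the kubelet keeps reporting in the entries above.
            fmt.Println("no CNI configuration file found; network plugin not ready")
        }
    }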
event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.841095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.841115 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.841127 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.872054 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:10Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:10 crc kubenswrapper[4892]: E1006 12:09:10.872421 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.875538 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.875621 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.875642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.875671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.875694 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.978827 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.978900 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.978923 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.978954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:10 crc kubenswrapper[4892]: I1006 12:09:10.978975 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:10Z","lastTransitionTime":"2025-10-06T12:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.081740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.081805 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.081822 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.081847 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.081866 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.167926 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.167962 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:11 crc kubenswrapper[4892]: E1006 12:09:11.168119 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:11 crc kubenswrapper[4892]: E1006 12:09:11.168410 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.185106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.185165 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.185182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.185203 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.185221 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.288766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.288821 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.288838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.288862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.288880 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.391862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.391936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.391954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.391978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.391997 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.495800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.495949 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.496062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.496095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.496134 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.598922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.599016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.599066 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.599090 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.599108 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.702315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.702427 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.702448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.702474 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.702491 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.805463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.805533 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.805550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.805575 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.805595 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.908553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.908617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.908633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.908657 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:11 crc kubenswrapper[4892]: I1006 12:09:11.908674 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:11Z","lastTransitionTime":"2025-10-06T12:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.011632 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.011693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.011713 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.011735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.011752 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.115202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.115263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.115282 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.115307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.115364 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.167922 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:12 crc kubenswrapper[4892]: E1006 12:09:12.168257 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.168382 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:12 crc kubenswrapper[4892]: E1006 12:09:12.168583 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.218996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.219048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.219067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.219092 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.219111 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.322288 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.322414 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.322432 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.322459 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.322478 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.426027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.426089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.426152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.426179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.426198 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.529234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.529293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.529305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.529349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.529363 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.632549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.632653 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.632678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.632706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.632729 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.678720 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:12 crc kubenswrapper[4892]: E1006 12:09:12.678987 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:12 crc kubenswrapper[4892]: E1006 12:09:12.679115 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:16.679082632 +0000 UTC m=+43.228788427 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.735627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.735681 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.735698 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.735723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.735740 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.838715 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.838788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.838811 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.838839 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.838859 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.942149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.942213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.942229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.942253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:12 crc kubenswrapper[4892]: I1006 12:09:12.942273 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:12Z","lastTransitionTime":"2025-10-06T12:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.046059 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.046137 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.046157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.046180 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.046199 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.149852 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.149929 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.149948 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.149973 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.149993 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.168266 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.168277 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:13 crc kubenswrapper[4892]: E1006 12:09:13.168555 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:13 crc kubenswrapper[4892]: E1006 12:09:13.168715 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.253481 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.253561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.253587 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.253618 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.253641 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.357237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.357391 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.357420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.357484 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.357589 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.460796 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.460854 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.460875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.460907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.460946 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.565319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.565420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.565440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.565467 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.565485 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.668713 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.668771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.668783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.668801 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.668812 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.771613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.771673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.771691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.771718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.771736 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.874454 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.874552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.874569 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.874591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.874604 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.976617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.976668 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.976680 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.976699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:13 crc kubenswrapper[4892]: I1006 12:09:13.976711 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:13Z","lastTransitionTime":"2025-10-06T12:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.079300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.079442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.079463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.079483 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.079498 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.168510 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:14 crc kubenswrapper[4892]: E1006 12:09:14.168676 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.168777 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:14 crc kubenswrapper[4892]: E1006 12:09:14.169102 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.181273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.181305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.181315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.181349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.181361 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.188092 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.205919 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.220151 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.239936 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca363
5b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.268871 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.283875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.283925 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.283934 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.283951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.283937 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.283964 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.300034 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.320392 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.337393 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.351555 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.365195 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.383076 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.387303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.387384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.387398 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.387421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.387437 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.399814 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.414659 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.432083 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.449551 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.469273 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:14Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.489553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.489628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.489644 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.489662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.489700 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.592145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.592243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.592287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.592315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.592372 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.696231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.696315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.696377 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.696420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.696458 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.799781 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.800243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.800450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.800619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.800800 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.903852 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.903936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.903971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.904002 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:14 crc kubenswrapper[4892]: I1006 12:09:14.904026 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:14Z","lastTransitionTime":"2025-10-06T12:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.007388 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.007464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.007495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.007523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.007543 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.110399 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.110464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.110488 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.110517 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.110540 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.168375 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.168380 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:15 crc kubenswrapper[4892]: E1006 12:09:15.168563 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:15 crc kubenswrapper[4892]: E1006 12:09:15.168702 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.213706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.213774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.213792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.213818 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.213837 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.317484 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.317553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.317571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.317596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.317616 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.420053 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.420107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.420125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.420152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.420169 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.523046 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.523110 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.523131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.523171 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.523200 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.625496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.625572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.625590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.625615 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.625631 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.728465 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.728517 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.728528 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.728720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.728730 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.831977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.832049 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.832061 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.832082 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.832093 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.934629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.934712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.934740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.934771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:15 crc kubenswrapper[4892]: I1006 12:09:15.934795 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:15Z","lastTransitionTime":"2025-10-06T12:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.038487 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.038547 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.038565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.038590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.038607 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.143718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.143796 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.143825 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.143859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.143883 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.168637 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.168636 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:16 crc kubenswrapper[4892]: E1006 12:09:16.168874 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:16 crc kubenswrapper[4892]: E1006 12:09:16.169006 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.247449 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.247504 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.247520 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.247545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.247562 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.350308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.350659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.350778 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.350904 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.351010 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.454505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.454561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.454577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.454599 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.454617 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.557963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.558018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.558034 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.558060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.558077 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.661643 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.661979 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.662140 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.662287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.662495 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.724653 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:16 crc kubenswrapper[4892]: E1006 12:09:16.724822 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:16 crc kubenswrapper[4892]: E1006 12:09:16.725134 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:24.725107357 +0000 UTC m=+51.274813152 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.766125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.766209 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.766226 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.766253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.766276 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.869637 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.869687 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.869697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.869719 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.869739 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.972992 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.973374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.973488 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.973596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:16 crc kubenswrapper[4892]: I1006 12:09:16.973690 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:16Z","lastTransitionTime":"2025-10-06T12:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.077263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.077359 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.077378 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.077402 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.077424 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.167566 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.167586 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:17 crc kubenswrapper[4892]: E1006 12:09:17.168281 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:17 crc kubenswrapper[4892]: E1006 12:09:17.168392 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.180623 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.180694 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.180716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.180747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.180770 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.284463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.284575 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.284593 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.284618 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.284636 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.387794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.387838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.387855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.387921 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.387938 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.491079 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.491439 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.491487 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.491520 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.491543 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.594754 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.594818 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.594836 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.594859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.594876 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.698098 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.698180 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.698203 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.698233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.698255 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.800899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.800955 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.800974 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.800998 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.801014 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.903631 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.903697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.903716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.903741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:17 crc kubenswrapper[4892]: I1006 12:09:17.903758 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:17Z","lastTransitionTime":"2025-10-06T12:09:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.006841 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.006927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.006949 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.006980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.007001 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.110038 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.110099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.110119 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.110148 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.110169 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.168036 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.168276 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:18 crc kubenswrapper[4892]: E1006 12:09:18.168505 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:18 crc kubenswrapper[4892]: E1006 12:09:18.168602 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.168941 4892 scope.go:117] "RemoveContainer" containerID="8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.213774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.213838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.213858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.213883 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.213900 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.316747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.316811 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.316829 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.316853 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.316870 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.419926 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.419983 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.420006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.420032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.420053 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.500104 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.502637 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.503181 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.516398 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.523083 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.523121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.523153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.523173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.523186 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.532029 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.547355 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.561812 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.575888 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.588037 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.602200 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.620413 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.625228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.625450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.625602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.625735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.625868 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.651591 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e7
3f66f8d0cd9dcf6d00b16118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.674846 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18
e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.695217 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.713506 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.729411 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.729468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.729486 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.729511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.729529 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.733106 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.750355 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.769224 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.784852 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.810269 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T12:09:18Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.832045 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.832125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.832150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.832183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.832210 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.935303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.935382 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.935395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.935417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:18 crc kubenswrapper[4892]: I1006 12:09:18.935433 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:18Z","lastTransitionTime":"2025-10-06T12:09:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.038612 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.038673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.038690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.038714 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.038732 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.142318 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.142442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.142464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.142491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.142510 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.168471 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.168579 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:19 crc kubenswrapper[4892]: E1006 12:09:19.168675 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:19 crc kubenswrapper[4892]: E1006 12:09:19.168801 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.246198 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.246250 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.246267 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.246291 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.246310 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.350249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.350726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.350786 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.350817 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.350844 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.454857 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.454907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.454926 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.454956 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.454975 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.557651 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.557707 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.557724 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.557747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.557763 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.666473 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.666563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.666590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.667112 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.667439 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.769670 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.769703 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.769711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.769723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.769732 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.872804 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.872872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.872894 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.872922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.872944 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.976726 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.977067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.977242 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.977446 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:19 crc kubenswrapper[4892]: I1006 12:09:19.977581 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:19Z","lastTransitionTime":"2025-10-06T12:09:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.082050 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.082416 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.082597 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.082776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.082926 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.168166 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.168252 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:20 crc kubenswrapper[4892]: E1006 12:09:20.168369 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:20 crc kubenswrapper[4892]: E1006 12:09:20.168422 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.185139 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.185185 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.185199 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.185219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.185236 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.288786 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.288844 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.288863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.288887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.288905 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.392171 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.392231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.392249 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.392273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.392292 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.495660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.495734 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.495758 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.495788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.495811 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.598239 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.598368 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.598402 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.598434 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.598458 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.701466 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.701524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.701541 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.701565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.701580 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.805067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.805135 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.805152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.805179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.805202 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.908095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.908157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.908174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.908200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:20 crc kubenswrapper[4892]: I1006 12:09:20.908218 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:20Z","lastTransitionTime":"2025-10-06T12:09:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.011546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.011593 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.011610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.011632 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.011649 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.113568 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.113601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.113610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.113624 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.113634 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.167694 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.167792 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:21 crc kubenswrapper[4892]: E1006 12:09:21.168005 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:21 crc kubenswrapper[4892]: E1006 12:09:21.168142 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.169375 4892 scope.go:117] "RemoveContainer" containerID="d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.169769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.169857 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.169889 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.170463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.170543 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: E1006 12:09:21.192347 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.198167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.198472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.198675 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.198867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.199050 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: E1006 12:09:21.221412 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.227071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.227130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.227158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.227183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.227200 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: E1006 12:09:21.251832 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.259367 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.259435 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.259455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.259480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.259505 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.288464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.288540 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.288556 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.288575 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.288591 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: E1006 12:09:21.309091 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.310566 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.310590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.310600 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.310616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.310627 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.413489 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.413542 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.413559 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.413583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.413602 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.515936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.516005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.516031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.516063 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.516088 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.519440 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/1.log" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.523960 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.524760 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.543263 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.581204 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.618273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.618307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.618335 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.618354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.618367 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.618503 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.653078 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ce
ecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.680063 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.695375 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.710002 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.721186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.721220 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.721228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.721244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.721253 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.723292 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.735499 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.747665 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.756435 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.766733 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.777555 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.786510 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.797881 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.810596 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.821352 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:21Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.822863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.822892 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.822903 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.822919 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.822931 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.924813 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.924847 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.924855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.924867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:21 crc kubenswrapper[4892]: I1006 12:09:21.924876 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:21Z","lastTransitionTime":"2025-10-06T12:09:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.027275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.027305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.027314 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.027358 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.027373 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.129839 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.129896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.129912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.129937 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.129956 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.168676 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.168831 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:22 crc kubenswrapper[4892]: E1006 12:09:22.168884 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:22 crc kubenswrapper[4892]: E1006 12:09:22.169034 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.233018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.233095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.233120 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.233153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.233179 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.336171 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.336245 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.336263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.336291 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.336309 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.438790 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.438858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.438874 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.438901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.438919 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.531293 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/2.log" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.532468 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/1.log" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.537525 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561" exitCode=1 Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.537595 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.537645 4892 scope.go:117] "RemoveContainer" containerID="d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.539589 4892 scope.go:117] "RemoveContainer" containerID="eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561" Oct 06 12:09:22 crc kubenswrapper[4892]: E1006 12:09:22.540235 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.541264 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.541354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.541373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.541398 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.541415 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.562546 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.585939 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.606943 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.625266 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.644673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.644738 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.644758 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.644784 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.644802 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.645442 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.667596 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.684801 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.718184 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o:
//a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.740539 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.748072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.748149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.748193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.748226 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.748248 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.761156 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.784573 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.815297 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ce
ecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5d8400dc89e29247535877fe340193eeb3782e73f66f8d0cd9dcf6d00b16118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:05Z\\\",\\\"message\\\":\\\"ift-operator-lifecycle-manager/package-server-manager-metrics\\\\\\\"}\\\\nI1006 12:09:05.474485 6307 services_controller.go:360] Finished syncing service package-server-manager-metrics on namespace openshift-operator-lifecycle-manager for network=default : 1.130412ms\\\\nI1006 12:09:05.474738 6307 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI1006 12:09:05.474779 6307 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1006 12:09:05.474805 6307 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1006 12:09:05.474859 6307 factory.go:1336] Added *v1.Node event handler 7\\\\nI1006 12:09:05.474909 6307 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1006 12:09:05.475272 6307 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1006 12:09:05.475425 6307 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1006 12:09:05.475469 6307 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:05.475496 6307 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:05.475551 6307 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch 
factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.836718 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.851658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.851727 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.851745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.851776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.851797 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.858829 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.880634 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.899768 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.924218 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\
"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed
81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:22Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.955279 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.955373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.955394 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.955423 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:22 crc kubenswrapper[4892]: I1006 12:09:22.955443 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:22Z","lastTransitionTime":"2025-10-06T12:09:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.059223 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.059287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.059311 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.059390 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.059417 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.162518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.162598 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.162619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.162654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.162676 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.167821 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.167858 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:23 crc kubenswrapper[4892]: E1006 12:09:23.168006 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:23 crc kubenswrapper[4892]: E1006 12:09:23.168190 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.265554 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.265621 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.265639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.265673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.265692 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.369153 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.369272 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.369304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.369377 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.369411 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.472887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.472980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.473001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.473028 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.473047 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.554438 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/2.log" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.561808 4892 scope.go:117] "RemoveContainer" containerID="eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561" Oct 06 12:09:23 crc kubenswrapper[4892]: E1006 12:09:23.562171 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.576723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.576811 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.576835 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.576866 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.576888 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.586492 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.606763 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.626106 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.641532 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.658631 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.676570 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.680006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.680051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.680065 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.680089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.680104 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.694745 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.726171 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.748685 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.767240 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.782887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.782935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.782946 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.782966 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.782981 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.788388 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.811940 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ce
ecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.833750 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.855038 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.875815 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.887130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.887200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.887219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.887245 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.887269 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.892105 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.915275 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:23Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.991012 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.991077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:23 crc 
kubenswrapper[4892]: I1006 12:09:23.991094 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.991117 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:23 crc kubenswrapper[4892]: I1006 12:09:23.991138 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:23Z","lastTransitionTime":"2025-10-06T12:09:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.093941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.093997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.094014 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.094040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.094058 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.167625 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.167785 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:24 crc kubenswrapper[4892]: E1006 12:09:24.168048 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:24 crc kubenswrapper[4892]: E1006 12:09:24.168159 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.185932 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.197483 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.197558 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.197571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.197589 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.197604 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.199497 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.214386 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.232182 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.248943 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.266779 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae8
6963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.292374 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.300771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.300814 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.300826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.300845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.300886 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.325417 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe554285290
5795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.354237 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.370996 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.392279 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.404519 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.404581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.404601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.404632 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.404651 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.421910 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ce
ecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.442002 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.457930 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.484629 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.505988 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.508643 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.508719 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.508745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.508775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.508797 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.528480 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:24Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.611778 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.611824 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.611836 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.611853 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.611864 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.714831 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.714888 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.714905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.714928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.714958 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.814232 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:24 crc kubenswrapper[4892]: E1006 12:09:24.814515 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:24 crc kubenswrapper[4892]: E1006 12:09:24.814634 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:40.814599315 +0000 UTC m=+67.364305150 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.817296 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.817397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.817420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.817455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.817477 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.920561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.920624 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.920642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.920666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:24 crc kubenswrapper[4892]: I1006 12:09:24.920686 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:24Z","lastTransitionTime":"2025-10-06T12:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.023455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.023532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.023552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.023575 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.023591 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.126149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.126216 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.126238 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.126268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.126291 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.168407 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.168419 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:25 crc kubenswrapper[4892]: E1006 12:09:25.168936 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:25 crc kubenswrapper[4892]: E1006 12:09:25.169078 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.229561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.229633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.229654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.229691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.229713 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.333204 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.333270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.333287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.333313 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.333372 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.436225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.436286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.436304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.436374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.436400 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.539712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.539772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.539788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.539811 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.539829 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.643176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.643236 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.643253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.643276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.643292 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.746150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.746563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.746779 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.746991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.747171 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.803241 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.821808 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.827515 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.846492 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.850862 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.850922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.850941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.850963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.850979 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.865457 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.881246 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.894681 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.908681 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.924728 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.928751 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:09:25 crc kubenswrapper[4892]: E1006 12:09:25.928938 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:09:57.928902157 +0000 UTC m=+84.478607962 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.944417 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.954067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.954131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.954149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.954176 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.954194 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:25Z","lastTransitionTime":"2025-10-06T12:09:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.962968 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:25 crc kubenswrapper[4892]: I1006 12:09:25.979598 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:25Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.007084 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ce
ecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:26Z is after 2025-08-24T17:21:41Z"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.030349 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.030397 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.030431 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.030479 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030574 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030629 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030657 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030668 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030685 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030698 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030743 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:58.030711357 +0000 UTC m=+84.580417192 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030764 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030779 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:58.030761408 +0000 UTC m=+84.580467213 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030603 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030814 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:58.030794169 +0000 UTC m=+84.580500114 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.030844 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:09:58.03082863 +0000 UTC m=+84.580534525 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.039572 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:26Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.052499 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:26Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.056649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.056688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.056704 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.056725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.056763 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.073604 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:26Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.090182 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:26Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.109093 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:26Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.124520 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:26Z is after 2025-08-24T17:21:41Z"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.158858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.158913 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.158933 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.158956 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.158975 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.168249 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.168275 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.168464 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 12:09:26 crc kubenswrapper[4892]: E1006 12:09:26.168585 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.262233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.262287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.262367 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.262389 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.262407 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.366936 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.367305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.367378 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.367403 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.367422 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.469818 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.469868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.469879 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.469896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.469910 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.572695 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.572745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.572762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.572782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.572799 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.675704 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.676320 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.676378 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.676431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.676448 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.779365 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.779413 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.779432 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.779455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.779473 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.882363 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.882421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.882437 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.882460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.882479 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.985317 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.985376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.985385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.985400 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:26 crc kubenswrapper[4892]: I1006 12:09:26.985409 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:26Z","lastTransitionTime":"2025-10-06T12:09:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.088074 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.088159 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.088180 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.088206 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.088225 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.168015 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.168127 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v"
Oct 06 12:09:27 crc kubenswrapper[4892]: E1006 12:09:27.168208 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 12:09:27 crc kubenswrapper[4892]: E1006 12:09:27.168367 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.191765 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.191822 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.191839 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.191863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.191879 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.299905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.299976 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.299997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.300022 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.300042 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.403230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.403294 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.403313 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.403365 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.403384 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.506517 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.506578 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.506594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.506619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.506641 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.610125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.610183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.610201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.610224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.610244 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.713127 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.713186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.713206 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.713231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.713249 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.816494 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.816559 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.816577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.816599 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.816616 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.919996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.920050 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.920067 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.920088 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:27 crc kubenswrapper[4892]: I1006 12:09:27.920109 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:27Z","lastTransitionTime":"2025-10-06T12:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.022775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.022862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.022884 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.022911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.022934 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.126520 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.126598 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.126619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.126647 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.126667 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.168189 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:28 crc kubenswrapper[4892]: E1006 12:09:28.168436 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.168537 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:28 crc kubenswrapper[4892]: E1006 12:09:28.168739 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.229460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.229523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.229540 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.229566 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.229585 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.332615 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.332671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.332688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.332711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.332728 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.435192 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.435251 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.435270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.435294 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.435311 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.538240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.538297 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.538371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.538408 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.538428 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.641408 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.641469 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.641489 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.641514 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.641531 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.744449 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.744550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.744571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.744596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.744613 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.847857 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.847918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.847940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.847963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.847983 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.951262 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.951361 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.951385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.951414 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:28 crc kubenswrapper[4892]: I1006 12:09:28.951434 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:28Z","lastTransitionTime":"2025-10-06T12:09:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.054622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.054681 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.054697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.054721 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.054739 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.157780 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.157854 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.157877 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.157907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.157930 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.167624 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.167657 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:29 crc kubenswrapper[4892]: E1006 12:09:29.167728 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:29 crc kubenswrapper[4892]: E1006 12:09:29.167872 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.261462 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.261527 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.261554 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.261583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.261607 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.365391 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.365450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.365465 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.365491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.365508 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.469236 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.469301 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.469319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.469377 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.469394 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.572617 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.572671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.572687 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.572711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.572728 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.675855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.675921 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.675943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.675972 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.675993 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.779059 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.779121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.779139 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.779163 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.779180 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.884071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.884127 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.884142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.884163 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.884181 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.987498 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.987583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.987609 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.987645 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:29 crc kubenswrapper[4892]: I1006 12:09:29.987667 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:29Z","lastTransitionTime":"2025-10-06T12:09:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.090762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.090826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.090845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.090867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.090884 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.167891 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.167964 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:30 crc kubenswrapper[4892]: E1006 12:09:30.168077 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:30 crc kubenswrapper[4892]: E1006 12:09:30.168221 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.192544 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.192605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.192625 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.192649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.192668 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.295130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.295229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.295261 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.295286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.295304 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.398838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.398929 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.398945 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.398968 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.398986 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.501731 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.501787 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.501804 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.501827 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.501844 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.605644 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.605709 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.605725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.605749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.605767 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.708855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.708897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.708908 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.708924 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.708936 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.811583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.811638 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.811654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.811678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.811696 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.914943 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.915009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.915027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.915055 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:30 crc kubenswrapper[4892]: I1006 12:09:30.915073 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:30Z","lastTransitionTime":"2025-10-06T12:09:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.017899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.017950 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.017965 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.017997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.018014 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.120890 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.120992 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.121010 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.121038 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.121057 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.168585 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.168594 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.168727 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.168822 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.224099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.224152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.224166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.224185 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.224199 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.327389 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.327442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.327453 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.327477 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.327493 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.413380 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.413448 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.413460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.413485 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.413500 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.428275 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:31Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.433383 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.433442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.433463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.433488 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.433505 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.445872 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:31Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.450802 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.450878 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
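Every status-patch retry in this log fails the same way: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate whose validity ended 2025-08-24T17:21:41Z, well before the logged current time 2025-10-06T12:09:31Z. A minimal Go sketch to confirm that from the node; the address is taken from the log above, and the probe is a generic TLS check, not kubelet code:

```go
// Minimal sketch: probe the webhook endpoint that the node-status patches
// fail against and print its serving certificate's validity window.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// InsecureSkipVerify lets the handshake complete even though the
	// certificate is expired, so its dates can be read.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:    %s\n", cert.Subject)
	fmt.Printf("not before: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Printf("not after:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Printf("expired:    %v\n", time.Now().UTC().After(cert.NotAfter))
}
```

A handshake with verification enabled would fail exactly as logged; skipping verification only so the probe can read the expired certificate's dates.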
event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.450903 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.450934 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.450957 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.464785 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:31Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.469194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.469232 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
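Separately from the webhook failure, the NotReady condition itself comes from the runtime reporting no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal sketch of that directory scan; CRI-O performs the real lookup through libcni, and the .conf/.conflist/.json extensions are the ones libcni conventionally accepts (an assumption here, not something stated in the log):

```go
// Minimal sketch of the check behind "no CNI configuration file in
// /etc/kubernetes/cni/net.d/": scan the directory for loadable configs.
package main

import (
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path from the log; run on the node
	entries, err := os.ReadDir(dir)
	if err != nil {
		log.Fatalf("read %s: %v", dir, err)
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration files; NetworkReady stays false")
	}
}
```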
event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.469244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.469262 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.469274 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.485502 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:31Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.489833 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.490158 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
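The condition object that the "Node became not ready" entries repeat is plain JSON; decoding it shows the fields the kubelet is trying to patch onto the Node. A self-contained sketch using only encoding/json, with the literal copied verbatim from the log (no Kubernetes client types assumed):

```go
// Self-contained sketch: decode the Ready condition from the log above.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// nodeCondition mirrors just the fields present in the logged condition.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s reason=%s\n%s\n", c.Type, c.Status, c.Reason, c.Message)
}
```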
event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.490242 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.490366 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.490461 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.506996 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:31Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:31 crc kubenswrapper[4892]: E1006 12:09:31.507294 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.508757 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.508799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.508813 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.508828 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.508840 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.611420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.612453 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.612545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.612581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.612606 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.715590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.715649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.715666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.715691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.715709 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.818709 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.818767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.818784 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.818808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.818828 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.921792 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.921868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.921891 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.921921 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:31 crc kubenswrapper[4892]: I1006 12:09:31.921942 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:31Z","lastTransitionTime":"2025-10-06T12:09:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.025308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.025392 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.025409 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.025431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.025448 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.037866 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.056109 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.078787 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.097748 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.119703 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.127901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.127975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.128001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.128032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.128056 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.138809 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.154388 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.168070 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.168119 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:32 crc kubenswrapper[4892]: E1006 12:09:32.168475 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:32 crc kubenswrapper[4892]: E1006 12:09:32.168604 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.173301 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.191796 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-conf
ig\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.207218 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.225641 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.230556 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.230609 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.230629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.230650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.230668 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.247912 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.267062 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.289630 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.309598 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.331040 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.333518 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.333598 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.333623 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.333654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.333676 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.364878 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ce
ecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.384437 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.419478 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:32Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.436619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.436664 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.436681 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.436703 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.436720 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.539431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.539487 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.539505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.539528 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.539545 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.642623 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.642703 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.642737 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.642767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.642791 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.745672 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.745753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.745788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.745818 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.745839 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.848694 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.848762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.848791 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.848858 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.848879 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.951642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.951703 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.951720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.951741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:32 crc kubenswrapper[4892]: I1006 12:09:32.951758 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:32Z","lastTransitionTime":"2025-10-06T12:09:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.055461 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.055529 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.055545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.055571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.055588 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.159228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.159289 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.159306 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.159370 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.159390 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.167611 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.167644 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:33 crc kubenswrapper[4892]: E1006 12:09:33.167789 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:33 crc kubenswrapper[4892]: E1006 12:09:33.167899 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.262359 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.262410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.262427 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.262449 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.262465 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.365373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.365452 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.365477 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.365502 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.365519 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.468510 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.468566 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.468582 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.468604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.468621 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.572136 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.572208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.572227 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.572253 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.572272 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.675404 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.675474 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.675496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.675524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.675547 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.778941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.779057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.779077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.779108 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.779136 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.882550 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.882638 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.882658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.882712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.882731 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.985779 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.985844 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.985861 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.985885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:33 crc kubenswrapper[4892]: I1006 12:09:33.985903 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:33Z","lastTransitionTime":"2025-10-06T12:09:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.090224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.090369 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.090392 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.090422 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.090441 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.167618 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.167695 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:34 crc kubenswrapper[4892]: E1006 12:09:34.167812 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:34 crc kubenswrapper[4892]: E1006 12:09:34.167943 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.188129 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.192953 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.193006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.193024 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.193049 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc 
kubenswrapper[4892]: I1006 12:09:34.193072 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.204228 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\
\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.223919 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.248144 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.264318 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.281292 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.296346 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.296563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.296696 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.296824 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.296944 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.300480 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.319828 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.335487 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.351076 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.365758 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.379785 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.395556 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.399699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.399768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.399791 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.399818 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.399837 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.413110 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.430370 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\
"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.459400 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ce
ecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.474921 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.502468 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.502698 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.502809 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.502920 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.503032 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.504122 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b7
7ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:34Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.605844 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.606226 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.606445 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.606623 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.606763 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.709939 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.710019 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.710042 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.710071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.710087 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.812706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.812768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.812784 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.812807 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.812823 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.916115 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.916175 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.916189 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.916216 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:34 crc kubenswrapper[4892]: I1006 12:09:34.916233 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:34Z","lastTransitionTime":"2025-10-06T12:09:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.019502 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.019556 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.019568 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.019587 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.019600 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.122173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.122242 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.122263 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.122288 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.122305 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.168118 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.168193 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:35 crc kubenswrapper[4892]: E1006 12:09:35.168245 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:35 crc kubenswrapper[4892]: E1006 12:09:35.168493 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.225689 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.225743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.225759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.225782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.225800 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.328783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.329157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.329376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.329902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.330054 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.433190 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.433258 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.433291 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.433319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.433368 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.536926 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.536989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.537006 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.537030 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.537047 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.639675 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.639733 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.639749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.639769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.639780 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.742922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.743007 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.743026 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.743605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.743660 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.846366 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.846419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.846435 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.846456 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.846474 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.949297 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.949391 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.949409 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.949436 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:35 crc kubenswrapper[4892]: I1006 12:09:35.949462 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:35Z","lastTransitionTime":"2025-10-06T12:09:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.053051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.053102 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.053120 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.053143 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.053159 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.156049 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.156515 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.156538 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.156562 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.156580 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.168389 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.168464 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:36 crc kubenswrapper[4892]: E1006 12:09:36.168558 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:36 crc kubenswrapper[4892]: E1006 12:09:36.168637 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.260224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.260596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.260750 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.260914 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.261055 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.364826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.364884 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.364902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.364928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.364964 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.468913 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.468973 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.468990 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.469013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.469031 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.572052 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.572126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.572144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.572167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.572184 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.676128 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.676189 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.676205 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.676228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.676246 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.780001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.780051 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.780066 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.780090 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.780106 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.882719 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.882782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.882799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.882822 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.882843 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.985952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.986033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.986064 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.986095 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:36 crc kubenswrapper[4892]: I1006 12:09:36.986112 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:36Z","lastTransitionTime":"2025-10-06T12:09:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.088416 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.088505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.088525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.088549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.088569 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.168235 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.168248 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:37 crc kubenswrapper[4892]: E1006 12:09:37.168530 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:37 crc kubenswrapper[4892]: E1006 12:09:37.168808 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.192090 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.192154 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.192173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.192205 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.192222 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.295683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.295932 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.295953 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.295978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.296037 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.398811 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.398867 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.398885 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.398912 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.398934 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.502197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.502282 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.502299 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.502349 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.502366 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.605906 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.605974 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.605993 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.606018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.606035 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.708403 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.708460 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.708478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.708502 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.708520 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.811513 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.811580 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.811596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.811625 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.811643 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.914141 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.914220 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.914236 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.914259 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:37 crc kubenswrapper[4892]: I1006 12:09:37.914275 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:37Z","lastTransitionTime":"2025-10-06T12:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.017676 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.017731 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.017747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.017771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.017789 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.121248 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.121312 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.121343 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.121364 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.121387 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.167829 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.167919 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:38 crc kubenswrapper[4892]: E1006 12:09:38.168588 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.168865 4892 scope.go:117] "RemoveContainer" containerID="eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561" Oct 06 12:09:38 crc kubenswrapper[4892]: E1006 12:09:38.169069 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:09:38 crc kubenswrapper[4892]: E1006 12:09:38.169176 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.223662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.223709 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.223720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.223740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.223753 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.325603 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.325927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.326075 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.326228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.326413 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.429589 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.429953 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.430114 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.430261 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.430422 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.532348 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.532374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.532381 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.532393 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.532402 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.634443 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.634495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.634511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.634532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.634547 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.737709 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.737744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.737752 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.737767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.737778 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.840004 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.840736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.840850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.840952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.841042 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.944256 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.944701 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.944931 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.945134 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:38 crc kubenswrapper[4892]: I1006 12:09:38.945368 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:38Z","lastTransitionTime":"2025-10-06T12:09:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.048444 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.048507 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.048524 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.048549 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.048567 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.151477 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.151523 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.151539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.151561 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.151576 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.168318 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.168311 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:39 crc kubenswrapper[4892]: E1006 12:09:39.168500 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:39 crc kubenswrapper[4892]: E1006 12:09:39.168672 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.254378 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.254437 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.254458 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.254481 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.254497 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.357747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.358177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.358438 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.358647 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.358875 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.461993 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.462039 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.462056 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.462077 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.462093 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.564690 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.564725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.564735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.564747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.564756 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.667544 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.667604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.667627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.667657 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.667679 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.770815 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.770859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.770869 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.770883 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.770895 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.873852 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.873878 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.873886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.873902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.873914 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.976078 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.976178 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.976195 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.976218 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:39 crc kubenswrapper[4892]: I1006 12:09:39.976236 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:39Z","lastTransitionTime":"2025-10-06T12:09:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.079045 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.079116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.079140 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.079168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.079191 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.168403 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:40 crc kubenswrapper[4892]: E1006 12:09:40.168618 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.168695 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:40 crc kubenswrapper[4892]: E1006 12:09:40.168876 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
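
The NotReady condition repeating above, and the "No sandbox for pod can be found" / "Error syncing pod, skipping" pairs, all trace to one check: the kubelet reports NetworkReady=false while no CNI configuration file exists in /etc/kubernetes/cni/net.d/, and it refuses to create pod sandboxes until the network provider writes one. A minimal sketch of that check follows; the directory path is taken from the log, while the .conf/.conflist/.json suffix filter is an assumption based on the CNI (libcni) file-naming convention, not on kubelet source.

```python
#!/usr/bin/env python3
"""Minimal sketch of the kubelet's NetworkReady check seen in the log:
report whether any CNI configuration file exists in the directory the
log names. Suffix filter follows the libcni convention (assumption)."""
import os

CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the log


def cni_configs(path: str = CNI_CONF_DIR) -> list[str]:
    try:
        entries = sorted(os.listdir(path))
    except FileNotFoundError:
        return []
    # libcni treats .conf, .conflist and .json files as config candidates
    return [e for e in entries if e.endswith((".conf", ".conflist", ".json"))]


if __name__ == "__main__":
    found = cni_configs()
    if found:
        print("NetworkReady=true, configs:", ", ".join(found))
    else:
        print("NetworkReady=false: no CNI configuration file in",
              CNI_CONF_DIR, "- has your network provider started?")
```

Once the network operator drops a conflist into that directory, the same loop flips NetworkReady to true and the skipped sandboxes (network-check-target, network-check-source) get created on the next sync.
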
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.181160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.181192 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.181202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.181213 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.181222 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.284246 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.284366 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.284385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.284410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.284426 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.387247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.387374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.387408 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.387443 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.387466 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.489611 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.489667 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.489683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.489706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.489725 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.592650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.592707 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.592725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.592772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.592790 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.699526 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.699572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.699586 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.699604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.699616 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.802721 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.802776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.802793 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.802817 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.802834 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.893779 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:40 crc kubenswrapper[4892]: E1006 12:09:40.893980 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:40 crc kubenswrapper[4892]: E1006 12:09:40.894088 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:10:12.894059264 +0000 UTC m=+99.443765059 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.905552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.905613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.905631 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.905654 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:40 crc kubenswrapper[4892]: I1006 12:09:40.905672 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:40Z","lastTransitionTime":"2025-10-06T12:09:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.007074 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.007130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.007147 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.007169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.007187 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.109311 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.109457 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.109475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.109502 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.109519 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.168023 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.168143 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.168256 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.168432 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
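
The MountVolume.SetUp failure above (secret "openshift-multus"/"metrics-daemon-secret" not registered at 12:09:40.894) shows the kubelet's per-volume retry backoff: "No retries permitted until 2025-10-06 12:10:12 ... (durationBeforeRetry 32s)". The sketch below reproduces that schedule under assumed constants (0.5s initial delay, doubling, capped around two minutes), which match the kubelet's generic volume-operation backoff but are not read from this cluster; the observed 32s falls out as the seventh consecutive failure.

```python
#!/usr/bin/env python3
"""Sketch of the exponential backoff behind "durationBeforeRetry 32s"
in the MountVolume failure above. Constants are assumptions (kubelet's
generic volume-op backoff), not values read from this node."""
INITIAL, FACTOR, CAP = 0.5, 2.0, 122.0  # seconds; assumed defaults


def backoff_schedule(failures: int):
    delay = INITIAL
    for n in range(failures):
        yield n + 1, min(delay, CAP)
        delay *= FACTOR


if __name__ == "__main__":
    for attempt, delay in backoff_schedule(8):
        print(f"failure {attempt}: wait {delay:g}s before retrying")
    # failure 7 waits 32s, matching the log's durationBeforeRetry
```

The "m=+99.44" suffix in the retry deadline is journald-visible Go monotonic-clock annotation, so the 32s window is measured against uptime, not wall clock.
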
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.184656 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.212428 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.212495 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.212519 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.212548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.212571 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.315384 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.315434 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.315456 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.315492 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.315514 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.419029 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.419113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.419136 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.419167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.419189 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.522360 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.522419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.522442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.522470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.522493 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.556199 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.556256 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.556275 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.556302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.556319 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.571838 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:41Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.575822 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.575863 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
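
The giant patch entry that just failed is the real fault in this log: every node-status PATCH is rejected because the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired 2025-08-24T17:21:41Z, well before the current time 2025-10-06T12:09:41Z. The kubelet retries the same payload several times per sync, which is why the identical body reappears below at .591, .611, and .629. A hedged sketch to confirm the expiry from the node follows; the endpoint comes from the log, and the third-party `cryptography` package is an assumed dependency for parsing the PEM.

```python
#!/usr/bin/env python3
"""Sketch: fetch the webhook's serving certificate and compare its
expiry to the current time, reproducing the x509 "certificate has
expired" failure above. Endpoint from the log; 'cryptography' is an
assumed third-party dependency."""
import ssl
from datetime import datetime, timezone

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # webhook address from the log

# get_server_certificate() does not verify the chain by default, so an
# expired certificate is still returned for inspection.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

# not_valid_after_utc needs a recent cryptography release; fall back to
# the naive UTC value on older ones.
expiry = getattr(cert, "not_valid_after_utc", None)
if expiry is None:
    expiry = cert.not_valid_after.replace(tzinfo=timezone.utc)

now = datetime.now(timezone.utc)
verdict = "is after" if now > expiry else "is before"
print(f"current time {now:%Y-%m-%dT%H:%M:%SZ} {verdict} "
      f"expiry {expiry:%Y-%m-%dT%H:%M:%SZ}")
```

On CRC this pattern typically means the cluster's rotated certificates were never refreshed after a long suspend; until the webhook's cert is renewed, the node object cannot be patched and the NotReady condition never clears.
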
event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.575880 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.575901 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.575916 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.591003 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:41Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.594641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.594684 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
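
For readers decoding the escaped payload itself: the body the kubelet keeps resending is a strategic-merge patch against /api/v1/nodes/crc/status. The "$setElementOrder/conditions" directive pins the ordering of the four conditions, and each list element merges on its "type" key, so only changed fields need to be carried. A minimal sketch of the same shape is below; the values are copied from the log and the conditions list is abridged to the Ready entry for brevity.

```python
#!/usr/bin/env python3
"""Sketch of the strategic-merge patch shape the kubelet retries above:
$setElementOrder pins condition order; elements merge on their 'type'
key. Values are illustrative, copied (abridged) from the log."""
import json

now = "2025-10-06T12:09:41Z"
patch = {
    "status": {
        "$setElementOrder/conditions": [
            {"type": t}
            for t in ("MemoryPressure", "DiskPressure", "PIDPressure", "Ready")
        ],
        "conditions": [
            {"type": "Ready", "status": "False", "reason": "KubeletNotReady",
             "lastHeartbeatTime": now, "lastTransitionTime": now,
             "message": "container runtime network not ready: ..."},
        ],
    },
}
# A PATCH with content type application/strategic-merge-patch+json
# would carry this body to /api/v1/nodes/crc/status.
print(json.dumps(patch, indent=2))
```

The allocatable/capacity and images sections repeated in the log ride along in the same patch, which is why each rejected attempt is so large.
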
event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.594700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.594721 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.594736 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.611831 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:41Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.616475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.616511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.616530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.616546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.616559 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.629983 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:41Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.632850 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.632887 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.632905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.632926 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.632940 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.652759 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:41Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:41 crc kubenswrapper[4892]: E1006 12:09:41.652917 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.654668 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.654712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.654725 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.654743 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.654755 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.757680 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.757711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.757719 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.757732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.757742 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.859775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.859835 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.859852 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.859875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.859894 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.962606 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.962643 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.962652 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.962666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:41 crc kubenswrapper[4892]: I1006 12:09:41.962694 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:41Z","lastTransitionTime":"2025-10-06T12:09:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.065220 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.065277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.065293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.065317 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.065362 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.167900 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:42 crc kubenswrapper[4892]: E1006 12:09:42.168026 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.168059 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:42 crc kubenswrapper[4892]: E1006 12:09:42.168372 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.168525 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.168581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.168605 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.168632 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.168649 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.271200 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.271276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.271300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.271362 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.271389 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.374062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.374113 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.374121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.374133 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.374142 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.476713 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.476747 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.476755 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.476770 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.476780 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.579247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.579302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.579319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.579373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.579396 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.631665 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/0.log" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.631738 4892 generic.go:334] "Generic (PLEG): container finished" podID="df1cea25-4170-457d-b579-2678161d7d53" containerID="7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1" exitCode=1 Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.631779 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerDied","Data":"7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.632309 4892 scope.go:117] "RemoveContainer" containerID="7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.654922 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.677297 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.681950 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.681982 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.681992 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.682010 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.682021 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.698160 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.714859 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.737027 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.754575 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.773717 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.784060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.784093 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.784104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.784122 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.784135 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.789862 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.804019 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.816736 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.831633 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.844227 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.864236 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.874958 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e90ba8e5-3cfd-4f9d-a47b-b6ff52a4dffa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.886033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.886074 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.886085 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.886102 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.886116 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.895858 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.912174 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.925676 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.940870 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"2025-10-06T12:08:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d\\\\n2025-10-06T12:08:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d to /host/opt/cni/bin/\\\\n2025-10-06T12:08:57Z [verbose] multus-daemon started\\\\n2025-10-06T12:08:57Z [verbose] Readiness Indicator file check\\\\n2025-10-06T12:09:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.962515 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:42Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.988304 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.988371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.988386 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.988406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:42 crc kubenswrapper[4892]: I1006 12:09:42.988420 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:42Z","lastTransitionTime":"2025-10-06T12:09:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.090849 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.090902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.090919 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.090942 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.090959 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.167821 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:43 crc kubenswrapper[4892]: E1006 12:09:43.168156 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.167916 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:43 crc kubenswrapper[4892]: E1006 12:09:43.168381 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.193371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.193417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.193431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.193449 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.193461 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.296037 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.296300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.296417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.296511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.296646 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.399065 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.399132 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.399142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.399156 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.399166 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.501821 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.501884 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.501902 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.501928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.501945 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.605449 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.605494 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.605509 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.605532 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.605548 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.642961 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/0.log" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.643055 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerStarted","Data":"0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.661906 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.676964 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.692388 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.707851 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.708974 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.709037 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.709061 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.709088 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.709111 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.725818 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.740990 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.754696 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.764976 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e90ba8e5-3cfd-4f9d-a47b-b6ff52a4dffa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.785047 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.798662 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.810416 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.811695 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.811780 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.811790 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.811827 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.811837 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.829491 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"2025-10-06T12:08:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d\\\\n2025-10-06T12:08:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d to /host/opt/cni/bin/\\\\n2025-10-06T12:08:57Z [verbose] multus-daemon started\\\\n2025-10-06T12:08:57Z [verbose] Readiness Indicator file check\\\\n2025-10-06T12:09:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.853397 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.867712 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.881020 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.894595 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.906795 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.914207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.914252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.914271 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.914294 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.914310 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:43Z","lastTransitionTime":"2025-10-06T12:09:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.922773 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:43 crc kubenswrapper[4892]: I1006 12:09:43.936348 4892 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:43Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.017034 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.017086 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.017102 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.017121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.017137 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.119841 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.119878 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.119888 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.119903 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.119913 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.168556 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.168678 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:44 crc kubenswrapper[4892]: E1006 12:09:44.169048 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:44 crc kubenswrapper[4892]: E1006 12:09:44.169072 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.183914 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.197800 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.212175 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.222590 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.222633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.222649 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.222670 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.222683 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.240682 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.275880 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.296716 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 
12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.309974 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.323711 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.325134 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.325346 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.325491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.325659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.325799 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.334997 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.345902 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.354871 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.368697 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.384869 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"2025-10-06T12:08:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d\\\\n2025-10-06T12:08:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d to /host/opt/cni/bin/\\\\n2025-10-06T12:08:57Z [verbose] multus-daemon started\\\\n2025-10-06T12:08:57Z [verbose] Readiness Indicator file check\\\\n2025-10-06T12:09:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.407365 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.420757 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c
97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.428085 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.428118 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.428127 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.428139 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.428149 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.434178 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e90ba8e5-3cfd-4f9d-a47b-b6ff52a4dffa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.455685 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.472102 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.485087 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:44Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.531232 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.531289 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.531301 4892 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.531340 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.531357 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.633922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.633963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.633975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.633991 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.634006 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.736293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.736388 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.736407 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.736432 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.736452 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.838899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.838941 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.838951 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.838966 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.838978 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.941666 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.941730 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.941741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.941779 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:44 crc kubenswrapper[4892]: I1006 12:09:44.941792 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:44Z","lastTransitionTime":"2025-10-06T12:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.044376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.044431 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.044442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.044456 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.044465 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.148021 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.148101 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.148119 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.148142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.148158 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.168468 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:45 crc kubenswrapper[4892]: E1006 12:09:45.168605 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.168472 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:45 crc kubenswrapper[4892]: E1006 12:09:45.168705 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.251138 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.251197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.251215 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.251240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.251258 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.354591 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.354648 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.354665 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.354687 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.354707 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.457539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.457600 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.457619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.457642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.457659 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.560058 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.560175 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.560190 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.560210 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.560224 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.662662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.662698 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.662707 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.662720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.662729 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.765074 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.765121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.765129 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.765145 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.765163 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.867157 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.867217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.867236 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.867261 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.867277 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.969924 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.969996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.970013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.970038 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:45 crc kubenswrapper[4892]: I1006 12:09:45.970057 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:45Z","lastTransitionTime":"2025-10-06T12:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.072123 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.072164 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.072173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.072187 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.072196 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.168146 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.168205 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:46 crc kubenswrapper[4892]: E1006 12:09:46.168351 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:46 crc kubenswrapper[4892]: E1006 12:09:46.168421 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.174344 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.174383 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.174397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.174415 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.174428 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.277732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.277785 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.277798 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.277818 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.277830 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.380720 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.380855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.380872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.380894 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.380934 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.483597 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.483639 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.483648 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.483662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.483670 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.585749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.585785 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.585793 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.585807 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.585817 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.688691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.688781 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.688799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.688823 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.688842 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.790644 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.790729 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.790745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.790771 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.790789 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.893386 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.893464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.893487 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.893520 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.893544 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.997255 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.997376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.997397 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.997422 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:46 crc kubenswrapper[4892]: I1006 12:09:46.997439 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:46Z","lastTransitionTime":"2025-10-06T12:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.099993 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.100080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.100099 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.100121 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.100139 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.168268 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.168296 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:47 crc kubenswrapper[4892]: E1006 12:09:47.168513 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:47 crc kubenswrapper[4892]: E1006 12:09:47.168655 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.202629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.202692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.202711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.202736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.202753 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.305896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.305928 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.305935 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.305948 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.305958 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.408442 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.408511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.408530 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.408557 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.408586 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.512388 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.512455 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.512472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.512494 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.512511 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.615406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.615461 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.615476 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.615494 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.615508 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.718583 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.718632 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.718645 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.718661 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.718672 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.821243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.821365 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.821385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.821412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.821429 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.923875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.923961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.923981 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.924027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:47 crc kubenswrapper[4892]: I1006 12:09:47.924043 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:47Z","lastTransitionTime":"2025-10-06T12:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.026776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.026840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.026856 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.026880 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.026897 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.129026 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.129085 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.129107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.129136 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.129161 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.168001 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.168030 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:48 crc kubenswrapper[4892]: E1006 12:09:48.168129 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:48 crc kubenswrapper[4892]: E1006 12:09:48.168190 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.232693 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.232794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.232819 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.232849 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.232872 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.336072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.336133 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.336150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.336173 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.336193 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.439445 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.439499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.439516 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.439539 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.439558 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.542646 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.542718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.542740 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.542768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.542789 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.647658 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.647716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.647733 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.647751 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.647764 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.750231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.750311 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.750353 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.750827 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.750884 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.854266 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.854355 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.854373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.854396 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.854413 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.957692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.957738 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.957753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.957779 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:48 crc kubenswrapper[4892]: I1006 12:09:48.957797 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:48Z","lastTransitionTime":"2025-10-06T12:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.060596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.060665 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.060689 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.060718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.060740 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.162952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.163001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.163009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.163022 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.163031 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.168254 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.168308 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:49 crc kubenswrapper[4892]: E1006 12:09:49.168409 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:49 crc kubenswrapper[4892]: E1006 12:09:49.168458 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.265284 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.265376 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.265395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.265417 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.265433 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.369071 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.369152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.369177 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.369207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.369231 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.472470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.472534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.472556 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.472584 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.472605 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.575923 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.576000 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.576023 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.576053 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.576074 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.679475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.679531 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.679548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.679570 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.679590 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.782812 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.782879 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.782897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.782922 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.782940 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.885799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.885911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.885933 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.885960 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.885982 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.988927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.989005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.989027 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.989056 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:49 crc kubenswrapper[4892]: I1006 12:09:49.989081 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:49Z","lastTransitionTime":"2025-10-06T12:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.092216 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.092268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.092284 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.092308 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.092373 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.168447 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.168486 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:50 crc kubenswrapper[4892]: E1006 12:09:50.168627 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:50 crc kubenswrapper[4892]: E1006 12:09:50.168903 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.194952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.195031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.195059 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.195089 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.195109 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.298481 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.298554 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.298571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.298592 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.298610 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.401914 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.401985 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.402007 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.402039 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.402057 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.505104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.505166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.505183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.505207 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.505230 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.607998 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.608066 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.608083 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.608106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.608124 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.710634 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.710706 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.710723 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.710749 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.710766 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.813886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.813967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.813993 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.814026 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.814049 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.916931 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.916987 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.917003 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.917026 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:50 crc kubenswrapper[4892]: I1006 12:09:50.917044 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:50Z","lastTransitionTime":"2025-10-06T12:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.020107 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.020179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.020201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.020230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.020256 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.123751 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.123832 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.123857 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.123888 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.123914 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.168306 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.168414 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:51 crc kubenswrapper[4892]: E1006 12:09:51.168573 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:51 crc kubenswrapper[4892]: E1006 12:09:51.168753 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.227168 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.227219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.227237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.227260 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.227277 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.330164 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.330261 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.330287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.330314 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.330367 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.433761 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.433818 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.433834 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.433859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.433878 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.537064 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.537230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.537265 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.537307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.537364 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.640047 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.640091 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.640103 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.640116 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.640126 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.743144 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.743194 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.743211 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.743234 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.743251 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.846313 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.846412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.846430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.846461 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.846497 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.900286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.900426 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.900452 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.900484 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.900509 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: E1006 12:09:51.922094 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:51Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.926985 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.927057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.927078 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.927104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.927124 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: E1006 12:09:51.947076 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:51Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.951882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.951917 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.951927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.951942 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.951953 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: E1006 12:09:51.966810 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:51Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.972242 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.972528 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.972546 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.972570 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.972591 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:51 crc kubenswrapper[4892]: E1006 12:09:51.992802 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:51Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.997246 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.997288 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.997305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.997364 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:51 crc kubenswrapper[4892]: I1006 12:09:51.997388 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:51Z","lastTransitionTime":"2025-10-06T12:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: E1006 12:09:52.012978 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: E1006 12:09:52.013204 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.015247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.015303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.015361 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.015395 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.015413 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.118826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.118884 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.118896 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.118918 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.118930 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.168151 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.168235 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:52 crc kubenswrapper[4892]: E1006 12:09:52.168398 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:52 crc kubenswrapper[4892]: E1006 12:09:52.168458 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.169679 4892 scope.go:117] "RemoveContainer" containerID="eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.222001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.222282 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.222478 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.222627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.222772 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.326358 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.326437 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.326464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.326496 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.326520 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.428645 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.428686 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.428698 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.428714 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.428727 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.531252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.531438 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.531466 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.531491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.531510 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.634167 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.634225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.634244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.634294 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.634314 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.676516 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/2.log" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.679655 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.680076 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.697004 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.713292 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.731273 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.736161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.736281 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.736382 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.736488 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.736548 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.750155 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:
08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.761030 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.772890 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.786969 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.796616 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.811212 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.824155 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.837467 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.838718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.838762 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.838779 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.838802 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.838821 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.850696 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.862225 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e90ba8e5-3cfd-4f9d-a47b-b6ff52a4dffa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.883920 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f254
7b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.896587 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab
32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.908175 4892 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.919397 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"2025-10-06T12:08:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d\\\\n2025-10-06T12:08:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d to /host/opt/cni/bin/\\\\n2025-10-06T12:08:57Z [verbose] multus-daemon started\\\\n2025-10-06T12:08:57Z [verbose] Readiness Indicator file check\\\\n2025-10-06T12:09:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.935756 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 
12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.941716 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.941768 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.941785 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.941808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.941824 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:52Z","lastTransitionTime":"2025-10-06T12:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:52 crc kubenswrapper[4892]: I1006 12:09:52.946737 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:52Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.043724 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.043778 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.043791 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.043808 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.043821 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.147374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.147409 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.147420 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.147436 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.147446 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.167777 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.167836 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:53 crc kubenswrapper[4892]: E1006 12:09:53.167876 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:53 crc kubenswrapper[4892]: E1006 12:09:53.167997 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.250501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.250566 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.250587 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.250615 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.250636 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.353892 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.353952 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.353975 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.354008 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.354026 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.457217 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.457305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.457355 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.457385 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.457409 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.560907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.560977 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.561003 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.561033 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.561061 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.663774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.663848 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.663871 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.663904 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.663925 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.686304 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/3.log" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.687537 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/2.log" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.692098 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" exitCode=1 Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.692153 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.692204 4892 scope.go:117] "RemoveContainer" containerID="eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.693432 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:09:53 crc kubenswrapper[4892]: E1006 12:09:53.693824 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.718740 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.738660 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.756365 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.766794 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.766840 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.766859 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.766882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.766899 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.780175 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:
08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.795071 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.812941 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.826565 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.838676 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.852027 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.866178 4892 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.869609 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.869674 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.869697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.869727 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.869749 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.882349 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.904302 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.917779 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e90ba8e5-3cfd-4f9d-a47b-b6ff52a4dffa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318
bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.946171 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.963011 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.973597 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.973659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.973683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.973713 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.973734 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:53Z","lastTransitionTime":"2025-10-06T12:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:53 crc kubenswrapper[4892]: I1006 12:09:53.980077 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.002450 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"2025-10-06T12:08:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d\\\\n2025-10-06T12:08:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d to /host/opt/cni/bin/\\\\n2025-10-06T12:08:57Z [verbose] multus-daemon started\\\\n2025-10-06T12:08:57Z [verbose] Readiness Indicator file check\\\\n2025-10-06T12:09:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.026981 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:53Z\\\",\\\"message\\\":\\\"ow:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 12:09:53.058262 6912 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1006 12:09:53.058500 6912 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.043474 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.076191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.076226 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.076243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.076258 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.076270 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.168101 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.168151 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:54 crc kubenswrapper[4892]: E1006 12:09:54.168213 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:54 crc kubenswrapper[4892]: E1006 12:09:54.168350 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.178820 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.178846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.178855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.178869 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.178878 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.190114 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"2025-10-06T12:08:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d\\\\n2025-10-06T12:08:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d to /host/opt/cni/bin/\\\\n2025-10-06T12:08:57Z [verbose] multus-daemon started\\\\n2025-10-06T12:08:57Z [verbose] Readiness Indicator file check\\\\n2025-10-06T12:09:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.223146 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eef6c5e94eac8d546236025b43ad72f10d6436ceecf6bd7775aa13462640c561\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:22Z\\\",\\\"message\\\":\\\"22.092244 6552 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1006 12:09:22.092267 6552 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1006 12:09:22.092279 6552 handler.go:208] Removed *v1.Node event handler 2\\\\nI1006 12:09:22.092312 6552 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1006 12:09:22.092368 6552 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1006 12:09:22.092410 6552 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1006 12:09:22.092437 6552 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1006 12:09:22.092439 6552 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1006 12:09:22.092490 6552 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1006 12:09:22.093182 6552 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1006 12:09:22.093212 6552 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1006 12:09:22.093249 6552 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1006 12:09:22.093697 6552 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1006 12:09:22.094233 6552 factory.go:656] Stopping watch factory\\\\nI1006 12:09:22.094264 6552 ovnkube.go:599] Stopped ovnkube\\\\nI1006 12:09:22.094296 6552 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1006 12:09:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:53Z\\\",\\\"message\\\":\\\"ow:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 12:09:53.058262 6912 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1006 12:09:53.058500 6912 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} 
was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-
overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.241974 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.258589 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e90ba8e5-3cfd-4f9d-a47b-b6ff52a4dffa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.281302 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.281396 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.281421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.281456 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.281481 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.292609 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.316506 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.337849 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.357630 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.377410 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.384627 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.384691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.384715 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.384744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.384792 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.398545 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.415847 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.440450 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.459959 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 
12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.477473 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.487576 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.487642 4892 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.487661 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.487689 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.487710 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.496826 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.519251 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.538066 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.554764 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.569885 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.590053 4892 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.590105 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.590122 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.590146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.590162 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.692470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.692534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.692553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.692577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.692593 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.697086 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/3.log" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.701870 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:09:54 crc kubenswrapper[4892]: E1006 12:09:54.702245 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.720008 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f0107ee8-a9e2-4a14-b044-1c37a9df4d38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e91febbfb327e02303febba524718a86630352d5a4576ecb1993fd3225fef8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xjh7n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4t26s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.737149 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b3b9c3-f7a1-42cc-9108-e8afc1f27ad2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0fd03943cfc40c58d9d067e23f84cad0eb841507f178f624e5e4194118034ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://75ada5f7a7a21cfa947d03dfd3cd1c1b8408e05dcce85e9630d3781a7ea9bb4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qkcnd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:07Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9zj8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.755166 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bf88v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d042dea2-ba2d-4825-a01c-79d5eb2fc912\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kfgmm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:09:08Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bf88v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.772736 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dbdc9e-a080-45ce-a44c-cfe6397e62dc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abf7b8f80fd3466a26a4fe0ba359ef7a107576804f434abd199ba36e94c6f173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://30fe31ea2c850a3034d89c02c61b23bb8229a3882c901ca019a16ef8df0e7569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ded3f9c0673516a114186dd487766ba8cb43f033cabb37ad2a70df107b637897\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c2613be6b8b94c0bfe79d1a7913e53f100d38e4af476f9ae86963837169ec0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.790772 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42a04209f8016ae2e4007c4162acd9bd577c54932305c70acd83326d51e6e458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.794834 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.794873 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.794886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.794904 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.794918 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.808701 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://467fa3edd2a11c99dafe3cf272c490f3a879150395d86d9d91296baf89fa8fe2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.823377 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-scxvc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"452fb3c0-569f-4c83-ba2a-7e3bafcd509d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb12dc0af423e2cfff9c36643d5f5e13f514817da1192bb0da3d384fcacf1631\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hq6vg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:54Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-scxvc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.842813 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f62b4c4040192ee736bf4673607023c6a5ae377e174efdd4adff31ea719b99c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b85262d43d2aa552c1b7a4614b9723aa00c4a3831f1469012829e63c4dedda6a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.863837 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-5zfsp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"df1cea25-4170-457d-b579-2678161d7d53\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:42Z\\\",\\\"message\\\":\\\"2025-10-06T12:08:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d\\\\n2025-10-06T12:08:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a30e28f0-5df1-4819-85d8-521b99d1457d to /host/opt/cni/bin/\\\\n2025-10-06T12:08:57Z [verbose] multus-daemon started\\\\n2025-10-06T12:08:57Z [verbose] Readiness Indicator file check\\\\n2025-10-06T12:09:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h7x7k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-5zfsp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.889215 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e115ba33-9ba0-42d6-82a0-09ef8c996788\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-06T12:09:53Z\\\",\\\"message\\\":\\\"ow:map[addresses:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]} options:{GoMap:map[iface-id-ver:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:5c 10.217.0.92]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c94130be-172c-477c-88c4-40cc7eba30fe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1006 12:09:53.058262 6912 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF1006 12:09:53.058500 6912 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:53Z is after 2025-08\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swtk8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-cxmhh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.897411 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.897479 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.897505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.897535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.897559 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:54Z","lastTransitionTime":"2025-10-06T12:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.908574 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1de7c8c2-62f5-401a-ace1-9eeeda43672d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f894fe8fc253cff359f46c47dd3fd27cbf48f019564811a94372eb4add8c9d82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://095d0ad8774546ad62be6994b5acd753cce4bfcdf069f030e3c9913c49c7fb25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a9468a8163262750bfd5ca669ce6be444e75d49a51c0f4386525cea817d2dd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98057c8db68daa02c7bbed6655e98e0f80328e5cfe2b206855ace57896296a9c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.923375 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e90ba8e5-3cfd-4f9d-a47b-b6ff52a4dffa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://023bdeb58d3b704a5c9ebed84a02077257e3fc1a3de16fe138ca95e7c2bbae42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9
497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af9f75431f5a81753a1ce88c9497e61efd5169928b3a1571ea19f335147e1ecb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.953858 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6203df4-75a4-41e7-b75a-e4b0b852d354\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75d6301d7d56abc77a6f8f08c77c2a1781c9688a2366a0b6a66e837d5bfe800a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e66396af781abbd62f3dd18e15c3512671bfe5542852905795a69543c23e34f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67f6ad6c1b80469867945300453f80fb29665437a64f60bd7ed16b77ab3ec4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3aa1d95d2d3fe354af1887187b87761b4f2547b289329f7793cd683e80eab4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea185eae151acae1467933d0bedbef3280164e5dbb2c5721f48d31272e9156e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://45b2a188acf352de029e6dd4267f770f5186f9e17e0e441e01bd218f32e6df9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a9bfceeacd2e3c0bdae2a3a7b520d0be48d997f20aa7d90641f870d8eec81f4d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5def2ae20cb9d2b087a15b13a3c5c9f92f5647047ae94bbbb8f2151dfea43f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:54 crc kubenswrapper[4892]: I1006 12:09:54.976838 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d48842-fa04-46ee-99e9-338e72824cff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a8fc116654862a7fb5f2c328c195c746ad48ea04a85a416dc642e99d8d9ded5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e0434366eec924100f5f9cf051aa2ab32133082eca0995b72d1c6ee3502be85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44e6655b168bc62f371b058f7a0b4ddb737cfc553f470e2556271fe8375d3273\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d403a8222fc5b0e3d0911ebde06e8c7a40c4890fb5f8fdde4b6ec7274e67c34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a30224a8d5ed2f97a0de495f9f158e7391cf5b3160ca67e45783eba304b64c1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"file observer\\\\nW1006 12:08:53.987407 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1006 12:08:53.987547 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1006 12:08:53.988807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-485529856/tls.crt::/tmp/serving-cert-485529856/tls.key\\\\\\\"\\\\nI1006 12:08:54.360506 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1006 12:08:54.364792 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1006 12:08:54.364827 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1006 12:08:54.364856 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1006 12:08:54.364863 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1006 12:08:54.370579 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1006 12:08:54.370605 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370611 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1006 12:08:54.370617 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1006 12:08:54.370621 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1006 12:08:54.370620 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1006 12:08:54.370625 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1006 12:08:54.370637 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1006 12:08:54.372018 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:48Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c992e93bc01941f4b476631772ba6780a974e1032517de35bff42d4f1aff0d24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:37Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://533b031f6ba7eee937000ee45fb5f8355dc50b0cc5b71ed0bf7dcd2a3a18c505\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:34Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.000868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.000924 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.000940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.000962 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.000978 4892 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.001716 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9b947557-ac75-461e-8603-9c3ce29ad5ab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:09:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://333fe52ee31b3dbfb19b759566c828b7c46d0a57964189aabcce9d64cadee39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:09:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd25be3c3cc1b9dafc6a8a18661544f31569e4ce2eae7838cebef7b3bb3a1e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0200c8b1a8d1c5c5653963c0bd188991f975bfe208eddbe3b461b6df41e2d30\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://208d80280968f4cc89626106829c6c6d8a16cf1d672980f4e721d167a19bc6c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://fc16e425078cd4c1b575164e438b5a44f2bc6d3a71292276c98592be8fdc1606\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:08:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:08:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a04a17a8ca18f61c05281a30180919bc5e6c254cea0fe6d7a6a5b746cb5262a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5d421ac608e0ad27877741b16979b5a2215787dd0e8c7e5a713f389f8efc2bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-06T12:09:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-06T12:09:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-f49rw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xnzdd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:54Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.022427 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.041810 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.062862 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:54Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.078619 4892 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-djjtr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1b933165-d6e5-4add-ada2-6c87697e668b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-06T12:08:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://abdc85a86b75cf4297424815fb85e56c9ae340f2b9bfb565a2e10599297486c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-06T12:08:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-06T12:08:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-djjtr\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:09:55Z is after 2025-08-24T17:21:41Z" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.104286 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.104403 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.104424 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.104450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.104467 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.168394 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.168432 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:55 crc kubenswrapper[4892]: E1006 12:09:55.168577 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:55 crc kubenswrapper[4892]: E1006 12:09:55.168750 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.207865 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.207988 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.208009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.208031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.208048 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.311131 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.311192 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.311209 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.311237 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.311253 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.414046 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.414119 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.414142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.414184 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.414207 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.517594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.517651 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.517671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.517697 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.517715 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.620450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.620505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.620522 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.620545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.620562 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.723828 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.723875 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.723891 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.723915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.723958 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.827195 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.827246 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.827264 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.827289 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.827306 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.929505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.929604 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.929622 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.929650 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:55 crc kubenswrapper[4892]: I1006 12:09:55.929670 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:55Z","lastTransitionTime":"2025-10-06T12:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.033708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.033755 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.033769 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.033788 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.033806 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.136732 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.136772 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.136781 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.136795 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.136805 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.168337 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.168444 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:56 crc kubenswrapper[4892]: E1006 12:09:56.168461 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:09:56 crc kubenswrapper[4892]: E1006 12:09:56.168621 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.239618 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.239691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.239708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.239756 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.239776 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.342641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.342702 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.342719 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.342742 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.342762 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.446884 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.447037 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.447054 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.447080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.447098 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.549775 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.549838 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.549857 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.549876 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.549887 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.653477 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.653560 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.653582 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.653610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.653634 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.756201 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.756267 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.756289 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.756316 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.756370 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.859800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.859864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.859881 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.859944 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.859966 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.963501 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.963563 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.963581 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.963602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:56 crc kubenswrapper[4892]: I1006 12:09:56.963619 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:56Z","lastTransitionTime":"2025-10-06T12:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.066816 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.066866 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.066883 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.066905 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.066922 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.167577 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:57 crc kubenswrapper[4892]: E1006 12:09:57.167875 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.168056 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:09:57 crc kubenswrapper[4892]: E1006 12:09:57.168236 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.169804 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.169868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.169886 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.169909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.169928 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.272933 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.273013 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.273032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.273057 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.273074 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.376424 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.376505 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.376529 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.376559 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.376583 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.479535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.479608 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.479629 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.479657 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.479677 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.582862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.582937 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.582958 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.582987 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.583009 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.685919 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.685978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.686001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.686032 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.686109 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.788721 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.788809 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.788833 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.788855 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.788873 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.891776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.891831 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.891848 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.891872 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.891889 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.990015 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:09:57 crc kubenswrapper[4892]: E1006 12:09:57.990209 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.990173405 +0000 UTC m=+148.539879210 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.995281 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.995369 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.995391 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.995419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:57 crc kubenswrapper[4892]: I1006 12:09:57.995436 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:57Z","lastTransitionTime":"2025-10-06T12:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.091510 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.091572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.091592 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.091619 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091689 4892 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091723 4892 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091745 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091787 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091807 4892 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091740 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.091724015 +0000 UTC m=+148.641429780 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091855 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.091827839 +0000 UTC m=+148.641533644 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091878 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.09186616 +0000 UTC m=+148.641571965 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091898 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091965 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.091990 4892 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
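These MountVolume.SetUp failures are kubelet-side cache misses rather than API errors: secret, configmap, and projected volumes resolve their sources through kubelet's local object managers, and "object ... not registered" means no referenced namespace/name has been registered with that cache yet after the restart (as I understand it, registration happens when a pod referencing the object is re-admitted, which is still pending here), so the mounts go into the same 1m4s backoff as the CSI teardown above. When triaging a burst like this it helps to tally which objects are blocking; a small sketch over a journal excerpt, with the regex keyed to the exact message format in these entries:

import re
from collections import Counter

# Pulls the namespace/name out of kubelet's cache-miss errors, e.g.:
#   object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
NOT_REGISTERED = re.compile(r'object "([^"]+)"/"([^"]+)" not registered')

def tally_unregistered(journal_text: str) -> Counter:
    """Count cache-miss errors per referenced object across a log excerpt."""
    return Counter(NOT_REGISTERED.findall(journal_text))

sample = '''
E1006 12:09:58.091745 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
E1006 12:09:58.091787 4892 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
'''
for (ns, name), n in tally_unregistered(sample).most_common():
    print(f"{n}x {ns}/{name}")

For this window the tally is dominated by openshift-network-diagnostics/kube-root-ca.crt and openshift-service-ca.crt, the two objects gating both network-check pods' service account token mounts.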
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.092086 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.092057116 +0000 UTC m=+148.641762921 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.098083 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.098125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.098142 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.098166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.098184 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.168590 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.168651 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.168782 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 12:09:58 crc kubenswrapper[4892]: E1006 12:09:58.168909 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.200909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.200963 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.200980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.201005 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.201029 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.304506 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.304582 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.304601 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.304628 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.304646 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.407914 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.407964 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.407980 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.408004 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.408022 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.510961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.511020 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.511043 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.511070 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.511089 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.614106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.614161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.614178 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.614198 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.614216 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.716692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.716744 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.716761 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.716782 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.716798 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.819856 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.819940 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.819961 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.819986 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.820003 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.923126 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.923183 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.923203 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.923228 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:09:58 crc kubenswrapper[4892]: I1006 12:09:58.923248 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:09:58Z","lastTransitionTime":"2025-10-06T12:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 06 12:09:59 crc kubenswrapper[4892]: I1006 12:09:59.167924 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v"
Oct 06 12:09:59 crc kubenswrapper[4892]: I1006 12:09:59.167943 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:09:59 crc kubenswrapper[4892]: E1006 12:09:59.168104 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912"
Oct 06 12:09:59 crc kubenswrapper[4892]: E1006 12:09:59.168259 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 12:10:00 crc kubenswrapper[4892]: I1006 12:10:00.167582 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:10:00 crc kubenswrapper[4892]: I1006 12:10:00.167615 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:10:00 crc kubenswrapper[4892]: E1006 12:10:00.167775 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 12:10:00 crc kubenswrapper[4892]: E1006 12:10:00.167937 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 12:10:01 crc kubenswrapper[4892]: I1006 12:10:01.168153 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v"
Oct 06 12:10:01 crc kubenswrapper[4892]: I1006 12:10:01.168204 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:10:01 crc kubenswrapper[4892]: E1006 12:10:01.168444 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912"
Oct 06 12:10:01 crc kubenswrapper[4892]: E1006 12:10:01.168609 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.168141 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.168222 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.168366 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.168464 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.342492 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.342555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.342572 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.342596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.342617 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.351303 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.351415 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.351441 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.351471 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.351492 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.372572 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:10:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.377919 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.378017 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.378040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.378072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.378095 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.399527 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:10:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.405018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.405111 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
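Every one of these retries fails at the same point: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, roughly six weeks before the timestamps in this log, so the kubelet's node-status patch can never be admitted. A minimal Go sketch along the following lines (the address comes from the log; everything else is illustrative) can confirm the certificate dates from the node itself:

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Handshake with the webhook endpoint named in the kubelet errors.
        // Verification is skipped on purpose: the point is to inspect the
        // certificate, not to trust it.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatalf("handshake failed: %v", err)
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Printf("subject:   %s\n", cert.Subject)
        fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
        fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
        if time.Now().After(cert.NotAfter) {
            fmt.Println("certificate is expired, matching the x509 error in the log")
        }
    }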
event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.405130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.405154 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.405172 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.424576 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:10:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.429353 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.429410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.429427 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.429450 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.429468 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.448803 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:10:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.453472 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.453522 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.453540 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.453564 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.453581 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.472788 4892 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-06T12:10:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f7bf1197-2aff-4edc-bce6-57187119027c\\\",\\\"systemUUID\\\":\\\"2d0b290c-b340-4076-a23d-1a9b47beb5f4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-06T12:10:02Z is after 2025-08-24T17:21:41Z" Oct 06 12:10:02 crc kubenswrapper[4892]: E1006 12:10:02.473115 4892 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.475279 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
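The final E-level entry is the kubelet giving up for this sync cycle: upstream kubelet attempts the status update a small fixed number of times per cycle (the nodeStatusUpdateRetry constant, 5 at the time of writing), which matches the five consecutive "will retry" errors above before "update node status exceeds retry count". A simplified sketch of that control flow, not kubelet's actual code, with a stand-in patch call that always fails the way the expired webhook certificate makes it fail here:

    package main

    import (
        "errors"
        "fmt"
    )

    // nodeStatusUpdateRetry mirrors the upstream kubelet constant of the
    // same name; the value 5 matches the five "will retry" errors above.
    const nodeStatusUpdateRetry = 5

    // patchNodeStatus is a stand-in for the real API call; here it always
    // fails, the way the expired webhook certificate makes it fail in the log.
    func patchNodeStatus() error {
        return errors.New(`failed calling webhook "node.network-node-identity.openshift.io"`)
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := patchNodeStatus(); err != nil {
                fmt.Printf("Error updating node status, will retry: %v\n", err)
                continue
            }
            return nil
        }
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println("Unable to update node status:", err)
        }
    }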
event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.475362 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.475379 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.475404 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.475423 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.578311 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.578413 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.578430 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.578464 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.578489 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.681223 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.681270 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.681287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.681310 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.681357 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.784493 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.784552 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.784568 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.784592 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.784610 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.888160 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.888235 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.888259 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.888287 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.888307 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.991470 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.991596 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.991619 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.991642 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:02 crc kubenswrapper[4892]: I1006 12:10:02.991659 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:02Z","lastTransitionTime":"2025-10-06T12:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
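Separately from the webhook failure, the node keeps reporting NotReady because the kubelet finds no CNI network configuration under /etc/kubernetes/cni/net.d/, the conf directory this kubelet is evidently configured with; the condition will only clear once the cluster network provider writes a config file there. A small Go sketch of the check the error message implies (the extension list mirrors what CNI config loaders commonly accept; treat the details as illustrative):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // The directory named in the kubelet error; on this node it is
        // empty (or missing) until the network operator writes a config.
        confDir := "/etc/kubernetes/cni/net.d"

        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        found := 0
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config:", filepath.Join(confDir, e.Name()))
                found++
            }
        }
        if found == 0 {
            fmt.Println("no CNI configuration file in", confDir,
                "- kubelet will keep reporting NetworkReady=false")
        }
    }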
Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.094721 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.094783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.094805 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.094832 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.094857 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.168201 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.168377 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:03 crc kubenswrapper[4892]: E1006 12:10:03.168736 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:03 crc kubenswrapper[4892]: E1006 12:10:03.168868 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.197678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.197730 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.197750 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.197777 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.197799 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.300624 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.300671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.300691 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.300718 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.300738 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.404139 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.404191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.404208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.404230 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.404246 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.506705 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.506755 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.506766 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.506785 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.506797 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.609870 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.609927 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.609944 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.610016 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.610035 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.712780 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.712842 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.712860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.712882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.712898 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.815989 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.816053 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.816076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.816106 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.816127 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.919790 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.919860 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.919883 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.919913 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:03 crc kubenswrapper[4892]: I1006 12:10:03.919938 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:03Z","lastTransitionTime":"2025-10-06T12:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.022915 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.022976 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.022994 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.023018 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.023035 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.126202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.126277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.126300 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.126362 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.126398 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.168474 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:04 crc kubenswrapper[4892]: E1006 12:10:04.168654 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.168752 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:04 crc kubenswrapper[4892]: E1006 12:10:04.169005 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.225719 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-5zfsp" podStartSLOduration=69.225687243 podStartE2EDuration="1m9.225687243s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.225224937 +0000 UTC m=+90.774930792" watchObservedRunningTime="2025-10-06 12:10:04.225687243 +0000 UTC m=+90.775393058" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.229767 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.229904 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.229946 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.229978 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.229999 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.289835 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.289818134 podStartE2EDuration="39.289818134s" podCreationTimestamp="2025-10-06 12:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.289495233 +0000 UTC m=+90.839201088" watchObservedRunningTime="2025-10-06 12:10:04.289818134 +0000 UTC m=+90.839523909" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.304441 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=23.304411121 podStartE2EDuration="23.304411121s" podCreationTimestamp="2025-10-06 12:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.303851832 +0000 UTC m=+90.853557667" watchObservedRunningTime="2025-10-06 12:10:04.304411121 +0000 UTC m=+90.854116926" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.333209 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.333274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.333293 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.333317 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.333371 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.349914 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.34988902 podStartE2EDuration="1m10.34988902s" podCreationTimestamp="2025-10-06 12:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.349719874 +0000 UTC m=+90.899425699" watchObservedRunningTime="2025-10-06 12:10:04.34988902 +0000 UTC m=+90.899594825" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.410701 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xnzdd" podStartSLOduration=69.410676419 podStartE2EDuration="1m9.410676419s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.408838238 +0000 UTC m=+90.958544043" watchObservedRunningTime="2025-10-06 12:10:04.410676419 +0000 UTC m=+90.960382224" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.411166 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.411154065 podStartE2EDuration="1m9.411154065s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.376037373 +0000 UTC m=+90.925743188" watchObservedRunningTime="2025-10-06 12:10:04.411154065 +0000 UTC m=+90.960859860" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.437081 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.437161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.437204 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.437243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.437266 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.489213 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-djjtr" podStartSLOduration=69.489186751 podStartE2EDuration="1m9.489186751s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.48915598 +0000 UTC m=+91.038861785" watchObservedRunningTime="2025-10-06 12:10:04.489186751 +0000 UTC m=+91.038892556" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.501201 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podStartSLOduration=69.501177571 podStartE2EDuration="1m9.501177571s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.50085132 +0000 UTC m=+91.050557105" watchObservedRunningTime="2025-10-06 12:10:04.501177571 +0000 UTC m=+91.050883346" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.516234 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9zj8" podStartSLOduration=69.516206993 podStartE2EDuration="1m9.516206993s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.51493008 +0000 UTC m=+91.064635885" watchObservedRunningTime="2025-10-06 12:10:04.516206993 +0000 UTC m=+91.065912798" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.540861 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.540945 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.540971 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.541004 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.541023 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.576120 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.576095212 podStartE2EDuration="1m10.576095212s" podCreationTimestamp="2025-10-06 12:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.550161887 +0000 UTC m=+91.099867732" watchObservedRunningTime="2025-10-06 12:10:04.576095212 +0000 UTC m=+91.125800987" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.606236 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-scxvc" podStartSLOduration=70.606212838 podStartE2EDuration="1m10.606212838s" podCreationTimestamp="2025-10-06 12:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:04.604748229 +0000 UTC m=+91.154454004" watchObservedRunningTime="2025-10-06 12:10:04.606212838 +0000 UTC m=+91.155918643" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.643799 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.643864 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.643882 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.643909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.643926 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.746425 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.746482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.746499 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.746521 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.746539 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.849373 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.849410 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.849419 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.849436 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.849444 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.952061 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.952162 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.952189 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.952219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:04 crc kubenswrapper[4892]: I1006 12:10:04.952243 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:04Z","lastTransitionTime":"2025-10-06T12:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.055273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.055347 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.055364 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.055382 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.055399 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.158143 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.158210 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.158229 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.158254 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.158271 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.167884 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.167906 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:05 crc kubenswrapper[4892]: E1006 12:10:05.168095 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:05 crc kubenswrapper[4892]: E1006 12:10:05.168220 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.261633 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.261712 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.261736 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.261806 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.261831 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.364909 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.364984 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.365009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.365037 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.365060 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.468098 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.468174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.468197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.468227 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.468269 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.571571 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.571641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.571660 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.571688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.571706 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.674611 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.674673 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.674692 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.674717 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.674736 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.777169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.777214 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.777224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.777240 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.777256 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.879899 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.879942 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.879954 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.879970 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.879982 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.983007 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.983058 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.983076 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.983096 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:05 crc kubenswrapper[4892]: I1006 12:10:05.983110 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:05Z","lastTransitionTime":"2025-10-06T12:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.086290 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.086700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.086717 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.086742 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.086763 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.168543 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:06 crc kubenswrapper[4892]: E1006 12:10:06.169211 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.168988 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:06 crc kubenswrapper[4892]: E1006 12:10:06.169734 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.190103 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.190149 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.190166 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.190188 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.190207 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.292273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.292315 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.292354 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.292369 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.292381 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.395440 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.395493 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.395509 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.395535 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.395553 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.497907 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.498371 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.498547 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.498700 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.498824 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.601783 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.602276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.602482 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.602659 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.602827 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.706191 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.706259 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.706277 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.706301 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.706348 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.809088 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.809137 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.809146 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.809163 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.809173 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.912193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.912276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.912305 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.912351 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:06 crc kubenswrapper[4892]: I1006 12:10:06.912369 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:06Z","lastTransitionTime":"2025-10-06T12:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.014676 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.014802 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.014826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.014862 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.014885 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.117727 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.117803 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.117826 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.117856 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.117880 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.168268 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:07 crc kubenswrapper[4892]: E1006 12:10:07.168415 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.168276 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:07 crc kubenswrapper[4892]: E1006 12:10:07.168621 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.220774 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.220815 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.220833 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.220854 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.220871 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.324421 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.324837 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.325031 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.325190 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.325385 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.428671 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.428727 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.428745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.428770 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.428787 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.531009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.531062 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.531080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.531104 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.531120 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.634393 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.634467 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.634485 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.634509 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.634528 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.737125 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.737182 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.737199 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.737222 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.737239 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.840512 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.840566 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.840582 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.840602 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.840621 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.943937 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.943983 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.943999 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.944019 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:07 crc kubenswrapper[4892]: I1006 12:10:07.944034 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:07Z","lastTransitionTime":"2025-10-06T12:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.047641 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.047719 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.047745 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.047778 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.047813 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.150967 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.151030 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.151048 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.151072 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.151090 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.168482 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:08 crc kubenswrapper[4892]: E1006 12:10:08.168780 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.169041 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:08 crc kubenswrapper[4892]: E1006 12:10:08.169397 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.254475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.254655 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.254679 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.254708 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.254730 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.358457 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.358534 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.358555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.358580 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.358598 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.461594 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.461648 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.461656 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.461676 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.461688 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.565038 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.565084 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.565100 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.565123 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.565145 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.668169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.668233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.668252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.668280 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.668299 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.771776 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.771831 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.771846 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.771869 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.771886 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.874688 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.874735 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.875174 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.875226 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.875246 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.981894 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.981955 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.981972 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.981997 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:08 crc kubenswrapper[4892]: I1006 12:10:08.982014 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:08Z","lastTransitionTime":"2025-10-06T12:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.085651 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.085728 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.085753 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.085800 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.085824 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.168288 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.168369 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:09 crc kubenswrapper[4892]: E1006 12:10:09.169282 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.169437 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:10:09 crc kubenswrapper[4892]: E1006 12:10:09.169558 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:09 crc kubenswrapper[4892]: E1006 12:10:09.169689 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.188179 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.188250 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.188276 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.188307 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.188373 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.291090 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.291152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.291170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.291193 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.291210 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.394087 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.394152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.394170 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.394196 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.394214 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.497911 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.497985 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.498001 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.498026 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.498044 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.601445 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.601512 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.601531 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.601555 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.601574 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.704161 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.704225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.704244 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.704273 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.704293 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.807406 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.807491 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.807515 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.807545 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.807568 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.909965 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.910009 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.910020 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.910035 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:09 crc kubenswrapper[4892]: I1006 12:10:09.910046 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:09Z","lastTransitionTime":"2025-10-06T12:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.012187 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.012231 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.012243 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.012260 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.012273 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.115319 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.115374 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.115387 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.115402 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.115414 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.168598 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.168649 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:10 crc kubenswrapper[4892]: E1006 12:10:10.168828 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:10 crc kubenswrapper[4892]: E1006 12:10:10.168971 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.218150 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.218209 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.218225 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.218247 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.218263 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.321711 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.321797 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.321821 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.321856 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.321873 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.424556 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.424600 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.424610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.424625 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.424635 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.527180 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.527236 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.527252 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.527274 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.527291 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.630492 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.630537 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.630548 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.630565 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.630577 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.733475 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.733553 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.733584 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.733616 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.733640 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.836136 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.836196 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.836218 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.836245 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.836268 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.939129 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.939224 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.939251 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.939283 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:10 crc kubenswrapper[4892]: I1006 12:10:10.939318 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:10Z","lastTransitionTime":"2025-10-06T12:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.042810 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.042845 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.042854 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.042868 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.042877 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.145607 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.145683 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.145707 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.145739 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.145763 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.168690 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.168729 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:11 crc kubenswrapper[4892]: E1006 12:10:11.168851 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:11 crc kubenswrapper[4892]: E1006 12:10:11.169026 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.248996 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.249043 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.249060 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.249081 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.249098 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.352233 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.352285 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.352297 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.352341 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.352355 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.455208 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.455268 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.455285 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.455310 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.455360 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.558790 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.558848 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.558870 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.558897 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.558914 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.666088 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.666511 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.666667 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.666822 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.666974 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.770411 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.770463 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.770480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.770502 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.770521 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.874007 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.874111 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.874130 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.874186 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.874203 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.977680 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.977741 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.977759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.977787 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:11 crc kubenswrapper[4892]: I1006 12:10:11.977804 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:11Z","lastTransitionTime":"2025-10-06T12:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.080964 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.081040 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.081053 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.081080 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.081094 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:12Z","lastTransitionTime":"2025-10-06T12:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.168370 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.168558 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:12 crc kubenswrapper[4892]: E1006 12:10:12.168723 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:12 crc kubenswrapper[4892]: E1006 12:10:12.168905 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.184480 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.184551 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.184577 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.184610 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.184634 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:12Z","lastTransitionTime":"2025-10-06T12:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.288129 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.288202 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.288219 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.288258 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.288277 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:12Z","lastTransitionTime":"2025-10-06T12:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.391094 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.391152 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.391169 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.391197 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.391216 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:12Z","lastTransitionTime":"2025-10-06T12:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.494264 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.494317 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.494380 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.494412 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.494432 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:12Z","lastTransitionTime":"2025-10-06T12:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.525613 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.525662 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.525678 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.525699 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.525717 4892 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-06T12:10:12Z","lastTransitionTime":"2025-10-06T12:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.614754 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29"] Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.615125 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.617570 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.617606 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.617745 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.619945 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.754519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dead480f-8f25-49cc-83e8-4ee6f6149e32-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.754598 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dead480f-8f25-49cc-83e8-4ee6f6149e32-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.754727 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dead480f-8f25-49cc-83e8-4ee6f6149e32-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.754886 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dead480f-8f25-49cc-83e8-4ee6f6149e32-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.754945 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dead480f-8f25-49cc-83e8-4ee6f6149e32-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.856163 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dead480f-8f25-49cc-83e8-4ee6f6149e32-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.856239 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dead480f-8f25-49cc-83e8-4ee6f6149e32-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.856291 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dead480f-8f25-49cc-83e8-4ee6f6149e32-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.856409 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dead480f-8f25-49cc-83e8-4ee6f6149e32-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.856463 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dead480f-8f25-49cc-83e8-4ee6f6149e32-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.856573 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/dead480f-8f25-49cc-83e8-4ee6f6149e32-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.856931 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/dead480f-8f25-49cc-83e8-4ee6f6149e32-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.858011 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dead480f-8f25-49cc-83e8-4ee6f6149e32-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.866372 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dead480f-8f25-49cc-83e8-4ee6f6149e32-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.886139 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dead480f-8f25-49cc-83e8-4ee6f6149e32-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dnh29\" (UID: \"dead480f-8f25-49cc-83e8-4ee6f6149e32\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.931746 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" Oct 06 12:10:12 crc kubenswrapper[4892]: I1006 12:10:12.957056 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:12 crc kubenswrapper[4892]: E1006 12:10:12.957257 4892 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:10:12 crc kubenswrapper[4892]: E1006 12:10:12.957381 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs podName:d042dea2-ba2d-4825-a01c-79d5eb2fc912 nodeName:}" failed. No retries permitted until 2025-10-06 12:11:16.957349855 +0000 UTC m=+163.507055660 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs") pod "network-metrics-daemon-bf88v" (UID: "d042dea2-ba2d-4825-a01c-79d5eb2fc912") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 06 12:10:13 crc kubenswrapper[4892]: I1006 12:10:13.167923 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:13 crc kubenswrapper[4892]: E1006 12:10:13.168369 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:13 crc kubenswrapper[4892]: I1006 12:10:13.167977 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:13 crc kubenswrapper[4892]: E1006 12:10:13.168452 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:13 crc kubenswrapper[4892]: I1006 12:10:13.772465 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" event={"ID":"dead480f-8f25-49cc-83e8-4ee6f6149e32","Type":"ContainerStarted","Data":"e93db534fca28eab8754cd8badfa7d70de08656f92c8bbf8a6dfed79146b992e"} Oct 06 12:10:13 crc kubenswrapper[4892]: I1006 12:10:13.772561 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" event={"ID":"dead480f-8f25-49cc-83e8-4ee6f6149e32","Type":"ContainerStarted","Data":"66a62321725e2494d0953ce8b9d906e6c7b7a170b915a66e3498e6bfb87e4c5c"} Oct 06 12:10:14 crc kubenswrapper[4892]: I1006 12:10:14.168160 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:14 crc kubenswrapper[4892]: E1006 12:10:14.169972 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:14 crc kubenswrapper[4892]: I1006 12:10:14.170162 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:14 crc kubenswrapper[4892]: E1006 12:10:14.170437 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:15 crc kubenswrapper[4892]: I1006 12:10:15.167736 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:15 crc kubenswrapper[4892]: E1006 12:10:15.167891 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:15 crc kubenswrapper[4892]: I1006 12:10:15.168243 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:15 crc kubenswrapper[4892]: E1006 12:10:15.169146 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:16 crc kubenswrapper[4892]: I1006 12:10:16.168567 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:16 crc kubenswrapper[4892]: I1006 12:10:16.168614 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:16 crc kubenswrapper[4892]: E1006 12:10:16.169033 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:16 crc kubenswrapper[4892]: E1006 12:10:16.169210 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:17 crc kubenswrapper[4892]: I1006 12:10:17.168453 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:17 crc kubenswrapper[4892]: I1006 12:10:17.168453 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:17 crc kubenswrapper[4892]: E1006 12:10:17.168683 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:17 crc kubenswrapper[4892]: E1006 12:10:17.168809 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:18 crc kubenswrapper[4892]: I1006 12:10:18.167933 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:18 crc kubenswrapper[4892]: E1006 12:10:18.168130 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:18 crc kubenswrapper[4892]: I1006 12:10:18.168403 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:18 crc kubenswrapper[4892]: E1006 12:10:18.168582 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:19 crc kubenswrapper[4892]: I1006 12:10:19.168450 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:19 crc kubenswrapper[4892]: I1006 12:10:19.168561 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:19 crc kubenswrapper[4892]: E1006 12:10:19.168671 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:19 crc kubenswrapper[4892]: E1006 12:10:19.168828 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:20 crc kubenswrapper[4892]: I1006 12:10:20.167777 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:20 crc kubenswrapper[4892]: E1006 12:10:20.168098 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:20 crc kubenswrapper[4892]: I1006 12:10:20.168399 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:20 crc kubenswrapper[4892]: E1006 12:10:20.168661 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:20 crc kubenswrapper[4892]: I1006 12:10:20.169986 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:10:20 crc kubenswrapper[4892]: E1006 12:10:20.170497 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-cxmhh_openshift-ovn-kubernetes(e115ba33-9ba0-42d6-82a0-09ef8c996788)\"" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" Oct 06 12:10:21 crc kubenswrapper[4892]: I1006 12:10:21.168154 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:21 crc kubenswrapper[4892]: I1006 12:10:21.168272 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:21 crc kubenswrapper[4892]: E1006 12:10:21.168427 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:21 crc kubenswrapper[4892]: E1006 12:10:21.168619 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:22 crc kubenswrapper[4892]: I1006 12:10:22.168593 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:22 crc kubenswrapper[4892]: I1006 12:10:22.168862 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:22 crc kubenswrapper[4892]: E1006 12:10:22.169007 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:22 crc kubenswrapper[4892]: E1006 12:10:22.169172 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:23 crc kubenswrapper[4892]: I1006 12:10:23.167719 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:23 crc kubenswrapper[4892]: I1006 12:10:23.167718 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:23 crc kubenswrapper[4892]: E1006 12:10:23.168043 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:23 crc kubenswrapper[4892]: E1006 12:10:23.168221 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:24 crc kubenswrapper[4892]: I1006 12:10:24.168667 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:24 crc kubenswrapper[4892]: I1006 12:10:24.168771 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:24 crc kubenswrapper[4892]: E1006 12:10:24.170498 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:24 crc kubenswrapper[4892]: E1006 12:10:24.170630 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:25 crc kubenswrapper[4892]: I1006 12:10:25.168712 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:25 crc kubenswrapper[4892]: I1006 12:10:25.168716 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:25 crc kubenswrapper[4892]: E1006 12:10:25.168913 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:25 crc kubenswrapper[4892]: E1006 12:10:25.169024 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:26 crc kubenswrapper[4892]: I1006 12:10:26.168409 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:26 crc kubenswrapper[4892]: I1006 12:10:26.168538 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:26 crc kubenswrapper[4892]: E1006 12:10:26.168595 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:26 crc kubenswrapper[4892]: E1006 12:10:26.168782 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:27 crc kubenswrapper[4892]: I1006 12:10:27.167762 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:27 crc kubenswrapper[4892]: I1006 12:10:27.167761 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:27 crc kubenswrapper[4892]: E1006 12:10:27.167920 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:27 crc kubenswrapper[4892]: E1006 12:10:27.168032 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.167704 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.167745 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:28 crc kubenswrapper[4892]: E1006 12:10:28.167859 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:28 crc kubenswrapper[4892]: E1006 12:10:28.168001 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.830287 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/1.log" Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.831119 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/0.log" Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.831189 4892 generic.go:334] "Generic (PLEG): container finished" podID="df1cea25-4170-457d-b579-2678161d7d53" containerID="0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe" exitCode=1 Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.831267 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerDied","Data":"0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe"} Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.831444 4892 scope.go:117] "RemoveContainer" containerID="7e85dfe077513c5c6b9384c6b26a04b98d41edbbc451c62686129e2c8b1a63b1" Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.832167 4892 scope.go:117] "RemoveContainer" containerID="0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe" Oct 06 12:10:28 crc kubenswrapper[4892]: E1006 12:10:28.832628 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-5zfsp_openshift-multus(df1cea25-4170-457d-b579-2678161d7d53)\"" pod="openshift-multus/multus-5zfsp" podUID="df1cea25-4170-457d-b579-2678161d7d53" Oct 06 12:10:28 crc kubenswrapper[4892]: I1006 12:10:28.853017 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dnh29" podStartSLOduration=93.852999286 podStartE2EDuration="1m33.852999286s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:13.796909717 +0000 UTC m=+100.346615512" watchObservedRunningTime="2025-10-06 12:10:28.852999286 +0000 UTC m=+115.402705051" Oct 06 12:10:29 crc kubenswrapper[4892]: I1006 12:10:29.168215 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:29 crc kubenswrapper[4892]: I1006 12:10:29.168243 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:29 crc kubenswrapper[4892]: E1006 12:10:29.169232 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:29 crc kubenswrapper[4892]: E1006 12:10:29.169434 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:29 crc kubenswrapper[4892]: I1006 12:10:29.836264 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/1.log" Oct 06 12:10:30 crc kubenswrapper[4892]: I1006 12:10:30.168574 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:30 crc kubenswrapper[4892]: E1006 12:10:30.168712 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:30 crc kubenswrapper[4892]: I1006 12:10:30.168856 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:30 crc kubenswrapper[4892]: E1006 12:10:30.169022 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:31 crc kubenswrapper[4892]: I1006 12:10:31.168438 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:31 crc kubenswrapper[4892]: I1006 12:10:31.168462 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:31 crc kubenswrapper[4892]: E1006 12:10:31.168648 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:31 crc kubenswrapper[4892]: E1006 12:10:31.168769 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:32 crc kubenswrapper[4892]: I1006 12:10:32.168593 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:32 crc kubenswrapper[4892]: E1006 12:10:32.168800 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:32 crc kubenswrapper[4892]: I1006 12:10:32.168922 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:32 crc kubenswrapper[4892]: E1006 12:10:32.169076 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:33 crc kubenswrapper[4892]: I1006 12:10:33.168389 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:33 crc kubenswrapper[4892]: I1006 12:10:33.168390 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:33 crc kubenswrapper[4892]: E1006 12:10:33.169215 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:33 crc kubenswrapper[4892]: E1006 12:10:33.169448 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:33 crc kubenswrapper[4892]: I1006 12:10:33.169612 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:10:33 crc kubenswrapper[4892]: I1006 12:10:33.853756 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/3.log" Oct 06 12:10:33 crc kubenswrapper[4892]: I1006 12:10:33.858013 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerStarted","Data":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} Oct 06 12:10:33 crc kubenswrapper[4892]: I1006 12:10:33.858420 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:10:33 crc kubenswrapper[4892]: I1006 12:10:33.894463 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podStartSLOduration=98.894440537 podStartE2EDuration="1m38.894440537s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:33.891168617 +0000 UTC m=+120.440874422" watchObservedRunningTime="2025-10-06 12:10:33.894440537 +0000 UTC m=+120.444146342" Oct 06 12:10:34 crc kubenswrapper[4892]: I1006 12:10:34.118546 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bf88v"] Oct 06 12:10:34 crc kubenswrapper[4892]: I1006 12:10:34.118658 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:34 crc kubenswrapper[4892]: E1006 12:10:34.118752 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:34 crc kubenswrapper[4892]: I1006 12:10:34.167634 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:34 crc kubenswrapper[4892]: I1006 12:10:34.167679 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:34 crc kubenswrapper[4892]: E1006 12:10:34.169923 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:34 crc kubenswrapper[4892]: E1006 12:10:34.169817 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:34 crc kubenswrapper[4892]: E1006 12:10:34.173335 4892 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 06 12:10:34 crc kubenswrapper[4892]: E1006 12:10:34.278836 4892 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 12:10:35 crc kubenswrapper[4892]: I1006 12:10:35.168605 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:35 crc kubenswrapper[4892]: E1006 12:10:35.169089 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:35 crc kubenswrapper[4892]: I1006 12:10:35.168626 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:35 crc kubenswrapper[4892]: E1006 12:10:35.169270 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:36 crc kubenswrapper[4892]: I1006 12:10:36.168280 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:36 crc kubenswrapper[4892]: E1006 12:10:36.168524 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:36 crc kubenswrapper[4892]: I1006 12:10:36.168966 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:36 crc kubenswrapper[4892]: E1006 12:10:36.169108 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:37 crc kubenswrapper[4892]: I1006 12:10:37.167742 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:37 crc kubenswrapper[4892]: I1006 12:10:37.167741 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:37 crc kubenswrapper[4892]: E1006 12:10:37.167966 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:37 crc kubenswrapper[4892]: E1006 12:10:37.168047 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:38 crc kubenswrapper[4892]: I1006 12:10:38.168462 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:38 crc kubenswrapper[4892]: I1006 12:10:38.168561 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:38 crc kubenswrapper[4892]: E1006 12:10:38.168642 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:38 crc kubenswrapper[4892]: E1006 12:10:38.168785 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:39 crc kubenswrapper[4892]: I1006 12:10:39.168451 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:39 crc kubenswrapper[4892]: I1006 12:10:39.168482 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:39 crc kubenswrapper[4892]: I1006 12:10:39.169100 4892 scope.go:117] "RemoveContainer" containerID="0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe" Oct 06 12:10:39 crc kubenswrapper[4892]: E1006 12:10:39.169624 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:39 crc kubenswrapper[4892]: E1006 12:10:39.169785 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:39 crc kubenswrapper[4892]: E1006 12:10:39.283525 4892 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 12:10:39 crc kubenswrapper[4892]: I1006 12:10:39.882386 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/1.log" Oct 06 12:10:39 crc kubenswrapper[4892]: I1006 12:10:39.882469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerStarted","Data":"1f18db21adfe184eeb4fb4e20b10f5e36bb64cb873e1ad45648cc412d4cba0eb"} Oct 06 12:10:40 crc kubenswrapper[4892]: I1006 12:10:40.168211 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:40 crc kubenswrapper[4892]: E1006 12:10:40.168496 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:40 crc kubenswrapper[4892]: I1006 12:10:40.168580 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:40 crc kubenswrapper[4892]: E1006 12:10:40.168707 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:41 crc kubenswrapper[4892]: I1006 12:10:41.167710 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:41 crc kubenswrapper[4892]: E1006 12:10:41.168477 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:41 crc kubenswrapper[4892]: I1006 12:10:41.167789 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:41 crc kubenswrapper[4892]: E1006 12:10:41.168752 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:42 crc kubenswrapper[4892]: I1006 12:10:42.168407 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:42 crc kubenswrapper[4892]: E1006 12:10:42.168865 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:42 crc kubenswrapper[4892]: I1006 12:10:42.168531 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:42 crc kubenswrapper[4892]: E1006 12:10:42.169118 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:43 crc kubenswrapper[4892]: I1006 12:10:43.168403 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:43 crc kubenswrapper[4892]: I1006 12:10:43.168431 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:43 crc kubenswrapper[4892]: E1006 12:10:43.168652 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bf88v" podUID="d042dea2-ba2d-4825-a01c-79d5eb2fc912" Oct 06 12:10:43 crc kubenswrapper[4892]: E1006 12:10:43.168706 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 06 12:10:44 crc kubenswrapper[4892]: I1006 12:10:44.167916 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:44 crc kubenswrapper[4892]: E1006 12:10:44.169620 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 06 12:10:44 crc kubenswrapper[4892]: I1006 12:10:44.169667 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:44 crc kubenswrapper[4892]: E1006 12:10:44.169905 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 06 12:10:45 crc kubenswrapper[4892]: I1006 12:10:45.167717 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:10:45 crc kubenswrapper[4892]: I1006 12:10:45.167801 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 06 12:10:45 crc kubenswrapper[4892]: I1006 12:10:45.170569 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 06 12:10:45 crc kubenswrapper[4892]: I1006 12:10:45.172289 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 06 12:10:45 crc kubenswrapper[4892]: I1006 12:10:45.172296 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 06 12:10:45 crc kubenswrapper[4892]: I1006 12:10:45.172352 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 06 12:10:46 crc kubenswrapper[4892]: I1006 12:10:46.168489 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:10:46 crc kubenswrapper[4892]: I1006 12:10:46.168547 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 06 12:10:46 crc kubenswrapper[4892]: I1006 12:10:46.170516 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 06 12:10:46 crc kubenswrapper[4892]: I1006 12:10:46.170523 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.414759 4892 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.456338 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wrctr"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.456794 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.458581 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d86rc"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.459101 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.459720 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.461498 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.461779 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.462075 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.462318 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.462425 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.462653 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.467001 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.472630 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.475254 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.476365 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.476650 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.479137 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-knk7q"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.479865 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.483985 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cxk5j"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.484495 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zdpfp"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.484688 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.484922 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.485393 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.485895 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.486802 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.487646 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.491868 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9q95"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.492301 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.492893 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.493528 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f59vt"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.493541 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.493903 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.498423 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.498923 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qhnln"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.499611 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-qhnln" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.513801 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.513828 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.513894 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.514960 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.515689 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.515788 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516000 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516107 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516184 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516225 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516230 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516345 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516423 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516431 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516568 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516596 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516680 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516813 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.517061 4892 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516683 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.517419 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516572 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.517817 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.517871 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.517930 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.517946 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.518121 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.516819 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.518433 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.518636 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.518851 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.520261 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.520457 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.520806 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.521056 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.521607 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.521923 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.523761 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.529772 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.529995 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.539235 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.539642 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.539747 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540036 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540177 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540206 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540253 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540470 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540630 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540759 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-client-ca\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540796 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1335413b-43df-4ec7-a45d-eb1094b8a125-serving-cert\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.540914 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-config\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.541012 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbsw\" (UniqueName: \"kubernetes.io/projected/1335413b-43df-4ec7-a45d-eb1094b8a125-kube-api-access-8vbsw\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.541065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.541417 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-74v25"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.541752 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.541411 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.544891 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.545097 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.545228 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.549576 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.550866 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.551071 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.551553 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.552095 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.552471 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.552573 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.555587 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.559856 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.559875 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.560138 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.561790 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.561959 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.562086 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.563377 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9zbfw"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.563391 
4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.563526 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.563783 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.564046 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.564676 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.564698 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.565150 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.565221 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.566585 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.567190 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.567441 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.567601 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.567756 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.567878 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.568020 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.580699 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.580743 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.581429 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.581701 4892 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.581839 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.581972 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582110 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582273 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582483 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582613 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582665 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582742 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582876 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.582879 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.583403 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.589818 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.590555 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.590849 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.590907 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.604664 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.605050 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.606020 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.606072 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.614464 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-g2wtm"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.614857 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.615107 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-f5gdg"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.615560 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.616043 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.616195 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.619810 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.620412 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.620957 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.621232 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.623738 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.627336 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.627777 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.628030 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.631623 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.635250 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.639867 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.641868 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-config\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.641921 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5af32f0a-19b6-4fe5-9507-d50dd25053a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.641953 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.641977 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642000 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5af32f0a-19b6-4fe5-9507-d50dd25053a3-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642021 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/099dffdf-1bf9-451d-8248-d4104dcdf1b6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642061 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-serving-cert\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642082 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642102 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5f891dd-b8d4-4ecc-940b-f41218193a8b-audit-dir\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642125 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbsw\" (UniqueName: \"kubernetes.io/projected/1335413b-43df-4ec7-a45d-eb1094b8a125-kube-api-access-8vbsw\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642150 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-serving-cert\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642169 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-serving-cert\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642188 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642221 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-audit-policies\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642241 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-image-import-ca\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642261 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642282 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5f891dd-b8d4-4ecc-940b-f41218193a8b-node-pullsecrets\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642306 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642347 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-config\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642372 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-config\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642436 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07272b5f-3586-4998-8ae8-6b0365531863-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: 
I1006 12:10:53.642501 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642528 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642553 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qglb4\" (UniqueName: \"kubernetes.io/projected/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-kube-api-access-qglb4\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642715 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9b646909-419f-466d-84d8-0ccd08567b52-audit-dir\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642753 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099dffdf-1bf9-451d-8248-d4104dcdf1b6-config\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642777 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-ca\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642804 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frdzn\" (UniqueName: \"kubernetes.io/projected/9b646909-419f-466d-84d8-0ccd08567b52-kube-api-access-frdzn\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642842 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07272b5f-3586-4998-8ae8-6b0365531863-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642862 
4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642878 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642895 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642912 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-config\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642930 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-service-ca-bundle\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642968 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6cd9565-520a-47d6-bb93-7423147863ef-audit-dir\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.642986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-config\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643000 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-etcd-client\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643016 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7mfx\" (UniqueName: \"kubernetes.io/projected/b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2-kube-api-access-w7mfx\") pod \"downloads-7954f5f757-qhnln\" (UID: \"b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2\") " pod="openshift-console/downloads-7954f5f757-qhnln" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643033 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/03c90169-1577-4751-bae4-ed1d7c19b416-machine-approver-tls\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643050 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgh8n\" (UniqueName: \"kubernetes.io/projected/099dffdf-1bf9-451d-8248-d4104dcdf1b6-kube-api-access-dgh8n\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643129 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-config\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643145 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643162 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dhgn\" (UniqueName: \"kubernetes.io/projected/d6cd9565-520a-47d6-bb93-7423147863ef-kube-api-access-8dhgn\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643186 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr72w\" (UniqueName: \"kubernetes.io/projected/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-kube-api-access-tr72w\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643204 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-audit\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643221 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643240 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-serving-cert\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643301 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xzb9\" (UniqueName: \"kubernetes.io/projected/3f6f15a1-6936-485d-9788-4ac1eb111cc2-kube-api-access-2xzb9\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643341 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-service-ca\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6f15a1-6936-485d-9788-4ac1eb111cc2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643385 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f6f15a1-6936-485d-9788-4ac1eb111cc2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643444 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/03c90169-1577-4751-bae4-ed1d7c19b416-auth-proxy-config\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc 
kubenswrapper[4892]: I1006 12:10:53.643476 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jln9\" (UniqueName: \"kubernetes.io/projected/5af32f0a-19b6-4fe5-9507-d50dd25053a3-kube-api-access-8jln9\") pod \"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643499 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643543 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-etcd-client\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643563 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643578 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f130811c-c622-4ee1-994b-cda735eaaf41-serving-cert\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643656 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-serving-cert\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643709 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03c90169-1577-4751-bae4-ed1d7c19b416-config\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643735 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9vbx\" (UniqueName: \"kubernetes.io/projected/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-kube-api-access-k9vbx\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: 
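
Three distinct source locations carry a volume through setup in these records: reconciler_common.go:245 logs "VerifyControllerAttachedVolume started", reconciler_common.go:218 logs "MountVolume started", and operation_generator.go:637 logs "MountVolume.SetUp succeeded". The proxy-ca-bundles volume of controller-manager-879f6c89f-wrctr shows the full progression above, completing in roughly a millisecond (.642306 to .643563). A sketch that folds parsed records into that three-phase lifecycle and reports volumes that never finish, assuming (message, UniqueName) pairs in the shape the parser above produces:

from collections import defaultdict

PHASES = ("operationExecutor.VerifyControllerAttachedVolume started",
          "operationExecutor.MountVolume started",
          "MountVolume.SetUp succeeded")

def unfinished_volumes(records):
    """records: iterable of (message, unique_name) pairs from the journal."""
    seen = defaultdict(set)
    for msg, unique_name in records:
        for phase, prefix in enumerate(PHASES):
            if msg.startswith(prefix):
                seen[unique_name].add(phase)
    # a healthy volume walks all three phases; report everything else
    return {name: sorted(phases)
            for name, phases in seen.items() if len(phases) < 3}

A volume that stalls before the final phase is the usual signature behind a pod pinned in ContainerCreating, so this is the first filter worth running on a dump like this one.
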
I1006 12:10:53.643782 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6m6\" (UniqueName: \"kubernetes.io/projected/03c90169-1577-4751-bae4-ed1d7c19b416-kube-api-access-mt6m6\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643885 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-config\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643877 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwvxk\" (UniqueName: \"kubernetes.io/projected/f130811c-c622-4ee1-994b-cda735eaaf41-kube-api-access-fwvxk\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643958 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-audit-policies\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.643987 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wc7f\" (UniqueName: \"kubernetes.io/projected/07272b5f-3586-4998-8ae8-6b0365531863-kube-api-access-2wc7f\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644029 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-client-ca\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644086 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644130 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-client\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc 
kubenswrapper[4892]: I1006 12:10:53.644166 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-client-ca\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644211 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-trusted-ca\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644238 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdvbx\" (UniqueName: \"kubernetes.io/projected/e5f891dd-b8d4-4ecc-940b-f41218193a8b-kube-api-access-rdvbx\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644264 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/099dffdf-1bf9-451d-8248-d4104dcdf1b6-images\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644294 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-encryption-config\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644353 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1335413b-43df-4ec7-a45d-eb1094b8a125-serving-cert\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644409 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.644960 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.645058 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-encryption-config\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.645140 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.645228 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnss7\" (UniqueName: \"kubernetes.io/projected/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-kube-api-access-qnss7\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.645310 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-client-ca\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.645470 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d86rc"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.645504 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncchp"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.645983 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-psw22"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.646436 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.646516 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.646625 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.646635 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.647450 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.649773 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4rs"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.650511 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.658380 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.659171 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgnbr"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.659520 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.659545 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.662983 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.669710 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.670399 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.671131 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.663941 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1335413b-43df-4ec7-a45d-eb1094b8a125-serving-cert\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.670700 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.671475 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.682556 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.684972 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-knk7q"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.685009 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fwwjc"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.685256 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.685352 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.685430 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cxk5j"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.685445 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9q95"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.685518 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.687085 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.687903 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.687968 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.688925 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.691475 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.693227 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.708018 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.710050 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.711183 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.726460 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.728571 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-zdpfp"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.729758 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.732645 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.733862 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f59vt"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.734905 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4rs"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.736038 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g2wtm"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.737134 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.738241 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9zbfw"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.738736 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.739712 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncchp"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.740927 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.742041 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-74v25"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.743132 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hfjg7"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.743950 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.744412 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jdnln"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.745809 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.746334 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-f5gdg"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.748528 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.749886 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750441 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-etcd-client\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750539 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f130811c-c622-4ee1-994b-cda735eaaf41-serving-cert\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750624 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3d64b83b-4248-4a44-ada4-8fd3439d4d54-srv-cert\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750703 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-serving-cert\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750777 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03c90169-1577-4751-bae4-ed1d7c19b416-config\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750847 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9vbx\" (UniqueName: \"kubernetes.io/projected/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-kube-api-access-k9vbx\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750919 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6m6\" (UniqueName: \"kubernetes.io/projected/03c90169-1577-4751-bae4-ed1d7c19b416-kube-api-access-mt6m6\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.750988 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwvxk\" (UniqueName: \"kubernetes.io/projected/f130811c-c622-4ee1-994b-cda735eaaf41-kube-api-access-fwvxk\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.751084 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-audit-policies\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.751160 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-client\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.751262 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wc7f\" (UniqueName: \"kubernetes.io/projected/07272b5f-3586-4998-8ae8-6b0365531863-kube-api-access-2wc7f\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.751356 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-client-ca\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.751459 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752481 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3133ba3d-4738-4ae2-aa39-f651cd3d3bd1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9zbfw\" (UID: \"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752545 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-trusted-ca\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: 
I1006 12:10:53.752572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdvbx\" (UniqueName: \"kubernetes.io/projected/e5f891dd-b8d4-4ecc-940b-f41218193a8b-kube-api-access-rdvbx\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752601 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03c90169-1577-4751-bae4-ed1d7c19b416-config\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752608 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/099dffdf-1bf9-451d-8248-d4104dcdf1b6-images\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752674 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-encryption-config\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752701 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd66776b-8020-4b60-b147-fb605daea344-metrics-tls\") pod \"dns-operator-744455d44c-f5gdg\" (UID: \"dd66776b-8020-4b60-b147-fb605daea344\") " pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752732 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752754 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab463181-efdc-4a78-b735-176516f4d185-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4bs2h\" (UID: \"ab463181-efdc-4a78-b735-176516f4d185\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752778 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752797 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-metrics-certs\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752814 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c2de137a-5cfb-4e83-bc12-a51456830ecb-images\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752841 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnss7\" (UniqueName: \"kubernetes.io/projected/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-kube-api-access-qnss7\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752859 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752890 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-client-ca\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752901 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-encryption-config\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752921 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.751422 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752955 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.752991 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5af32f0a-19b6-4fe5-9507-d50dd25053a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f59vt\" 
(UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753014 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-default-certificate\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753030 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753069 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753090 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753107 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c9c567b-051e-4d81-9f50-f575b43b3a04-service-ca-bundle\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753147 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fskx7\" (UniqueName: \"kubernetes.io/projected/3d64b83b-4248-4a44-ada4-8fd3439d4d54-kube-api-access-fskx7\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753171 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5af32f0a-19b6-4fe5-9507-d50dd25053a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753190 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/099dffdf-1bf9-451d-8248-d4104dcdf1b6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753229 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c2de137a-5cfb-4e83-bc12-a51456830ecb-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753249 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-serving-cert\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753307 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v85v\" (UniqueName: \"kubernetes.io/projected/3133ba3d-4738-4ae2-aa39-f651cd3d3bd1-kube-api-access-5v85v\") pod \"multus-admission-controller-857f4d67dd-9zbfw\" (UID: \"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753347 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2de137a-5cfb-4e83-bc12-a51456830ecb-proxy-tls\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753371 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5f891dd-b8d4-4ecc-940b-f41218193a8b-audit-dir\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753395 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753434 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81b7b337-cd16-4231-bf36-f505d7d9afb5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753464 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-serving-cert\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: 
I1006 12:10:53.753496 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-serving-cert\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753525 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753590 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-audit-policies\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753622 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-audit-policies\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753625 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5f891dd-b8d4-4ecc-940b-f41218193a8b-node-pullsecrets\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753689 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-image-import-ca\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753713 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753737 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-config\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753759 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-config\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753789 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27583c54-ec17-44a2-8240-224df02a4cbc-secret-volume\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753810 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07272b5f-3586-4998-8ae8-6b0365531863-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753829 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753850 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753871 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qglb4\" (UniqueName: \"kubernetes.io/projected/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-kube-api-access-qglb4\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753888 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpd5\" (UniqueName: \"kubernetes.io/projected/eb1aaaad-b069-4b70-87aa-51beaea830b3-kube-api-access-jjpd5\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753910 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9b646909-419f-466d-84d8-0ccd08567b52-audit-dir\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753927 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099dffdf-1bf9-451d-8248-d4104dcdf1b6-config\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753944 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9hxm\" (UniqueName: \"kubernetes.io/projected/c2de137a-5cfb-4e83-bc12-a51456830ecb-kube-api-access-g9hxm\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753960 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3d64b83b-4248-4a44-ada4-8fd3439d4d54-profile-collector-cert\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753977 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-config\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753994 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-ca\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754010 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frdzn\" (UniqueName: \"kubernetes.io/projected/9b646909-419f-466d-84d8-0ccd08567b52-kube-api-access-frdzn\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754015 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-trusted-ca\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754027 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07272b5f-3586-4998-8ae8-6b0365531863-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754046 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754053 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/099dffdf-1bf9-451d-8248-d4104dcdf1b6-images\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754504 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754481 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5af32f0a-19b6-4fe5-9507-d50dd25053a3-available-featuregates\") pod \"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754624 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-service-ca-bundle\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754651 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6cd9565-520a-47d6-bb93-7423147863ef-audit-dir\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754698 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9b646909-419f-466d-84d8-0ccd08567b52-audit-dir\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754972 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.754676 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxljp\" (UniqueName: \"kubernetes.io/projected/dd66776b-8020-4b60-b147-fb605daea344-kube-api-access-dxljp\") pod \"dns-operator-744455d44c-f5gdg\" (UID: \"dd66776b-8020-4b60-b147-fb605daea344\") " pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755361 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81b7b337-cd16-4231-bf36-f505d7d9afb5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755388 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-config\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755405 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-etcd-client\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755441 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7mfx\" (UniqueName: \"kubernetes.io/projected/b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2-kube-api-access-w7mfx\") pod \"downloads-7954f5f757-qhnln\" (UID: \"b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2\") " pod="openshift-console/downloads-7954f5f757-qhnln" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755458 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46ttv\" (UniqueName: \"kubernetes.io/projected/27583c54-ec17-44a2-8240-224df02a4cbc-kube-api-access-46ttv\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755478 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755514 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9kds\" (UniqueName: \"kubernetes.io/projected/aac1222e-f92a-4345-8ca2-125d2d2c2627-kube-api-access-k9kds\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755533 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/03c90169-1577-4751-bae4-ed1d7c19b416-machine-approver-tls\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755549 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755566 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgh8n\" (UniqueName: \"kubernetes.io/projected/099dffdf-1bf9-451d-8248-d4104dcdf1b6-kube-api-access-dgh8n\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755607 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/099dffdf-1bf9-451d-8248-d4104dcdf1b6-config\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755701 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-config\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.755721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.753693 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e5f891dd-b8d4-4ecc-940b-f41218193a8b-node-pullsecrets\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756629 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756459 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5f891dd-b8d4-4ecc-940b-f41218193a8b-audit-dir\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") 
" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756543 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-ca\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756552 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756273 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756677 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-config\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756679 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756831 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-image-import-ca\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dhgn\" (UniqueName: \"kubernetes.io/projected/d6cd9565-520a-47d6-bb93-7423147863ef-kube-api-access-8dhgn\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756952 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27583c54-ec17-44a2-8240-224df02a4cbc-config-volume\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756969 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-serving-cert\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.756980 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr72w\" (UniqueName: \"kubernetes.io/projected/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-kube-api-access-tr72w\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757002 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-audit\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757026 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7cg\" (UniqueName: \"kubernetes.io/projected/4c9c567b-051e-4d81-9f50-f575b43b3a04-kube-api-access-6q7cg\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7c2\" (UniqueName: \"kubernetes.io/projected/ab463181-efdc-4a78-b735-176516f4d185-kube-api-access-ws7c2\") pod \"control-plane-machine-set-operator-78cbb6b69f-4bs2h\" (UID: \"ab463181-efdc-4a78-b735-176516f4d185\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757087 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb1aaaad-b069-4b70-87aa-51beaea830b3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757105 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-service-ca\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757125 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-serving-cert\") pod 
\"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757144 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xzb9\" (UniqueName: \"kubernetes.io/projected/3f6f15a1-6936-485d-9788-4ac1eb111cc2-kube-api-access-2xzb9\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757165 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-stats-auth\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757186 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6f15a1-6936-485d-9788-4ac1eb111cc2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757208 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f6f15a1-6936-485d-9788-4ac1eb111cc2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757228 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfvx8\" (UniqueName: \"kubernetes.io/projected/caa02b36-1e64-4845-ac7a-1d9ba48b1d18-kube-api-access-hfvx8\") pod \"migrator-59844c95c7-66c2c\" (UID: \"caa02b36-1e64-4845-ac7a-1d9ba48b1d18\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757244 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b7b337-cd16-4231-bf36-f505d7d9afb5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757262 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757286 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/03c90169-1577-4751-bae4-ed1d7c19b416-auth-proxy-config\") 
pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757288 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-audit-policies\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-etcd-serving-ca\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757417 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-config\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jln9\" (UniqueName: \"kubernetes.io/projected/5af32f0a-19b6-4fe5-9507-d50dd25053a3-kube-api-access-8jln9\") pod \"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757451 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb1aaaad-b069-4b70-87aa-51beaea830b3-proxy-tls\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757806 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.757965 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/03c90169-1577-4751-bae4-ed1d7c19b416-auth-proxy-config\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.760103 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-serving-cert\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.760118 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-serving-cert\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.760589 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-etcd-client\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.760640 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6f15a1-6936-485d-9788-4ac1eb111cc2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761099 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-config\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-service-ca-bundle\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761155 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6cd9565-520a-47d6-bb93-7423147863ef-audit-dir\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761205 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f6f15a1-6936-485d-9788-4ac1eb111cc2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761368 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761508 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.762011 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761715 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-config\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761771 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761787 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgnbr"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761876 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761967 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07272b5f-3586-4998-8ae8-6b0365531863-config\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.762200 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/099dffdf-1bf9-451d-8248-d4104dcdf1b6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.761539 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e5f891dd-b8d4-4ecc-940b-f41218193a8b-audit\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.762280 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9b646909-419f-466d-84d8-0ccd08567b52-encryption-config\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.762353 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-service-ca\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.762569 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b646909-419f-466d-84d8-0ccd08567b52-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.763117 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-config\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.763171 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/03c90169-1577-4751-bae4-ed1d7c19b416-machine-approver-tls\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.763749 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5af32f0a-19b6-4fe5-9507-d50dd25053a3-serving-cert\") pod \"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.764704 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.765525 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.765810 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07272b5f-3586-4998-8ae8-6b0365531863-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.767453 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: 
\"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.767591 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-serving-cert\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.775297 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wrctr"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.776467 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.777526 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.778853 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-etcd-client\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.778943 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-encryption-config\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.778963 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.778984 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.778985 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-serving-cert\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.779053 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e5f891dd-b8d4-4ecc-940b-f41218193a8b-etcd-client\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.779070 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.779135 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f130811c-c622-4ee1-994b-cda735eaaf41-serving-cert\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.779173 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qhnln"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.780229 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hfjg7"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.781122 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.781171 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.782205 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jdnln"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.783187 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.784206 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.786486 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.787617 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ghtz9"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.789966 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ghtz9"] Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.790062 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.798946 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.818794 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858518 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c2de137a-5cfb-4e83-bc12-a51456830ecb-auth-proxy-config\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858602 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fskx7\" (UniqueName: \"kubernetes.io/projected/3d64b83b-4248-4a44-ada4-8fd3439d4d54-kube-api-access-fskx7\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858665 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v85v\" (UniqueName: \"kubernetes.io/projected/3133ba3d-4738-4ae2-aa39-f651cd3d3bd1-kube-api-access-5v85v\") pod \"multus-admission-controller-857f4d67dd-9zbfw\" (UID: \"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858687 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2de137a-5cfb-4e83-bc12-a51456830ecb-proxy-tls\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858706 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81b7b337-cd16-4231-bf36-f505d7d9afb5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27583c54-ec17-44a2-8240-224df02a4cbc-secret-volume\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858766 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9hxm\" (UniqueName: \"kubernetes.io/projected/c2de137a-5cfb-4e83-bc12-a51456830ecb-kube-api-access-g9hxm\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" 
Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858784 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpd5\" (UniqueName: \"kubernetes.io/projected/eb1aaaad-b069-4b70-87aa-51beaea830b3-kube-api-access-jjpd5\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858796 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.858803 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3d64b83b-4248-4a44-ada4-8fd3439d4d54-profile-collector-cert\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859020 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxljp\" (UniqueName: \"kubernetes.io/projected/dd66776b-8020-4b60-b147-fb605daea344-kube-api-access-dxljp\") pod \"dns-operator-744455d44c-f5gdg\" (UID: \"dd66776b-8020-4b60-b147-fb605daea344\") " pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859054 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859077 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9kds\" (UniqueName: \"kubernetes.io/projected/aac1222e-f92a-4345-8ca2-125d2d2c2627-kube-api-access-k9kds\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859098 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81b7b337-cd16-4231-bf36-f505d7d9afb5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859132 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46ttv\" (UniqueName: \"kubernetes.io/projected/27583c54-ec17-44a2-8240-224df02a4cbc-kube-api-access-46ttv\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859194 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c2de137a-5cfb-4e83-bc12-a51456830ecb-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859215 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7c2\" (UniqueName: \"kubernetes.io/projected/ab463181-efdc-4a78-b735-176516f4d185-kube-api-access-ws7c2\") pod \"control-plane-machine-set-operator-78cbb6b69f-4bs2h\" (UID: \"ab463181-efdc-4a78-b735-176516f4d185\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859243 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb1aaaad-b069-4b70-87aa-51beaea830b3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859264 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27583c54-ec17-44a2-8240-224df02a4cbc-config-volume\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859300 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7cg\" (UniqueName: \"kubernetes.io/projected/4c9c567b-051e-4d81-9f50-f575b43b3a04-kube-api-access-6q7cg\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859351 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-stats-auth\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859374 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfvx8\" (UniqueName: \"kubernetes.io/projected/caa02b36-1e64-4845-ac7a-1d9ba48b1d18-kube-api-access-hfvx8\") pod \"migrator-59844c95c7-66c2c\" (UID: \"caa02b36-1e64-4845-ac7a-1d9ba48b1d18\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859394 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b7b337-cd16-4231-bf36-f505d7d9afb5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859434 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb1aaaad-b069-4b70-87aa-51beaea830b3-proxy-tls\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859458 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3d64b83b-4248-4a44-ada4-8fd3439d4d54-srv-cert\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859676 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3133ba3d-4738-4ae2-aa39-f651cd3d3bd1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9zbfw\" (UID: \"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859785 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd66776b-8020-4b60-b147-fb605daea344-metrics-tls\") pod \"dns-operator-744455d44c-f5gdg\" (UID: \"dd66776b-8020-4b60-b147-fb605daea344\") " pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859844 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab463181-efdc-4a78-b735-176516f4d185-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4bs2h\" (UID: \"ab463181-efdc-4a78-b735-176516f4d185\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859876 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-metrics-certs\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859897 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c2de137a-5cfb-4e83-bc12-a51456830ecb-images\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859952 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-default-certificate\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.859974 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.860086 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c9c567b-051e-4d81-9f50-f575b43b3a04-service-ca-bundle\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.860578 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eb1aaaad-b069-4b70-87aa-51beaea830b3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.860751 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c2de137a-5cfb-4e83-bc12-a51456830ecb-images\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.860877 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b7b337-cd16-4231-bf36-f505d7d9afb5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.862466 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3133ba3d-4738-4ae2-aa39-f651cd3d3bd1-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9zbfw\" (UID: \"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.862477 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2de137a-5cfb-4e83-bc12-a51456830ecb-proxy-tls\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.862992 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81b7b337-cd16-4231-bf36-f505d7d9afb5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.879135 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.899440 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.920177 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 
Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.969433 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.979291 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Oct 06 12:10:53 crc kubenswrapper[4892]: I1006 12:10:53.999640 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.019549 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.025178 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dd66776b-8020-4b60-b147-fb605daea344-metrics-tls\") pod \"dns-operator-744455d44c-f5gdg\" (UID: \"dd66776b-8020-4b60-b147-fb605daea344\") " pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.038957 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.059049 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.080121 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.099212 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.118898 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.139185 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.159299 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.178786 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.200808 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.219563 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.239763 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.245007 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eb1aaaad-b069-4b70-87aa-51beaea830b3-proxy-tls\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.260426 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.279077 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.285400 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab463181-efdc-4a78-b735-176516f4d185-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4bs2h\" (UID: \"ab463181-efdc-4a78-b735-176516f4d185\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.301734 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.319511 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.338987 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.359710 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.379573 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.384426 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3d64b83b-4248-4a44-ada4-8fd3439d4d54-srv-cert\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.399989 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.414032 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3d64b83b-4248-4a44-ada4-8fd3439d4d54-profile-collector-cert\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.414471 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27583c54-ec17-44a2-8240-224df02a4cbc-secret-volume\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.420031 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.440109 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.480011 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.485633 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbsw\" (UniqueName: \"kubernetes.io/projected/1335413b-43df-4ec7-a45d-eb1094b8a125-kube-api-access-8vbsw\") pod \"controller-manager-879f6c89f-wrctr\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.499925 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.519213 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.539456 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.545515 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-metrics-certs\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.559809 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.564729 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-stats-auth\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.580676 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.599111 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.605789 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4c9c567b-051e-4d81-9f50-f575b43b3a04-default-certificate\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.620121 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.622277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c9c567b-051e-4d81-9f50-f575b43b3a04-service-ca-bundle\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.639254 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.657779 4892 request.go:700] Waited for 1.010772528s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.659844 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.677972 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.679958 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.700479 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.720314 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.740168 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.751586 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27583c54-ec17-44a2-8240-224df02a4cbc-config-volume\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.760766 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.787380 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.792283 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.801436 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.820038 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.831023 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.840565 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.860357 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.879643 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.899352 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.914469 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wrctr"]
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.919068 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 06 12:10:54 crc kubenswrapper[4892]: W1006 12:10:54.925107 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1335413b_43df_4ec7_a45d_eb1094b8a125.slice/crio-caf8ae585e03af8d95e558bac29784cac621a9b4eca2a862f50437e196d2e9d8 WatchSource:0}: Error finding container caf8ae585e03af8d95e558bac29784cac621a9b4eca2a862f50437e196d2e9d8: Status 404 returned error can't find the container with id caf8ae585e03af8d95e558bac29784cac621a9b4eca2a862f50437e196d2e9d8
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.938304 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.941802 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" event={"ID":"1335413b-43df-4ec7-a45d-eb1094b8a125","Type":"ContainerStarted","Data":"caf8ae585e03af8d95e558bac29784cac621a9b4eca2a862f50437e196d2e9d8"}
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.960096 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 06 12:10:54 crc kubenswrapper[4892]: I1006 12:10:54.979260 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.020082 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.039302 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.058567 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.078287 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.098871 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.119601 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.139475 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.159105 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.185655 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.199830 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.220129 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.239580 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.259960 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.279359 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.299711 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.319442 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.340517 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.359153 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.380132 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.399922 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.420035 4892 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.439916 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.459897 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.507672 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wc7f\" (UniqueName: \"kubernetes.io/projected/07272b5f-3586-4998-8ae8-6b0365531863-kube-api-access-2wc7f\") pod \"openshift-apiserver-operator-796bbdcf4f-nzqh4\" (UID: \"07272b5f-3586-4998-8ae8-6b0365531863\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.526873 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9vbx\" (UniqueName: \"kubernetes.io/projected/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-kube-api-access-k9vbx\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.544684 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6m6\" (UniqueName: \"kubernetes.io/projected/03c90169-1577-4751-bae4-ed1d7c19b416-kube-api-access-mt6m6\") pod \"machine-approver-56656f9798-9f9mr\" (UID: \"03c90169-1577-4751-bae4-ed1d7c19b416\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.568513 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwvxk\" (UniqueName: \"kubernetes.io/projected/f130811c-c622-4ee1-994b-cda735eaaf41-kube-api-access-fwvxk\") pod \"route-controller-manager-6576b87f9c-gh9gm\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.573777 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdvbx\" (UniqueName: \"kubernetes.io/projected/e5f891dd-b8d4-4ecc-940b-f41218193a8b-kube-api-access-rdvbx\") pod \"apiserver-76f77b778f-cxk5j\" (UID: \"e5f891dd-b8d4-4ecc-940b-f41218193a8b\") " pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.604117 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qglb4\" (UniqueName: \"kubernetes.io/projected/7f3f423a-0bf7-413b-b277-5d8c5dcec3d0-kube-api-access-qglb4\") pod \"authentication-operator-69f744f599-d86rc\" (UID: \"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.627983 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7mfx\" (UniqueName: \"kubernetes.io/projected/b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2-kube-api-access-w7mfx\") pod \"downloads-7954f5f757-qhnln\" (UID: \"b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2\") " pod="openshift-console/downloads-7954f5f757-qhnln" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.639662 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qnss7\" (UniqueName: \"kubernetes.io/projected/4a468eca-e75e-4bcb-84f4-27a4cc01e4cb-kube-api-access-qnss7\") pod \"console-operator-58897d9998-zdpfp\" (UID: \"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb\") " pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.652122 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.657776 4892 request.go:700] Waited for 1.897502469s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/serviceaccounts/openshift-controller-manager-operator/token Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.663955 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.670115 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr72w\" (UniqueName: \"kubernetes.io/projected/34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7-kube-api-access-tr72w\") pod \"etcd-operator-b45778765-74v25\" (UID: \"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.686891 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.692045 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xzb9\" (UniqueName: \"kubernetes.io/projected/3f6f15a1-6936-485d-9788-4ac1eb111cc2-kube-api-access-2xzb9\") pod \"openshift-controller-manager-operator-756b6f6bc6-pfsfk\" (UID: \"3f6f15a1-6936-485d-9788-4ac1eb111cc2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.700242 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgh8n\" (UniqueName: \"kubernetes.io/projected/099dffdf-1bf9-451d-8248-d4104dcdf1b6-kube-api-access-dgh8n\") pod \"machine-api-operator-5694c8668f-knk7q\" (UID: \"099dffdf-1bf9-451d-8248-d4104dcdf1b6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.720980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frdzn\" (UniqueName: \"kubernetes.io/projected/9b646909-419f-466d-84d8-0ccd08567b52-kube-api-access-frdzn\") pod \"apiserver-7bbb656c7d-xfvct\" (UID: \"9b646909-419f-466d-84d8-0ccd08567b52\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.724590 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.747021 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.749585 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dhgn\" (UniqueName: \"kubernetes.io/projected/d6cd9565-520a-47d6-bb93-7423147863ef-kube-api-access-8dhgn\") pod \"oauth-openshift-558db77b4-h9q95\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.765767 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jln9\" (UniqueName: \"kubernetes.io/projected/5af32f0a-19b6-4fe5-9507-d50dd25053a3-kube-api-access-8jln9\") pod \"openshift-config-operator-7777fb866f-f59vt\" (UID: \"5af32f0a-19b6-4fe5-9507-d50dd25053a3\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.772707 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-w9qcd\" (UID: \"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.773126 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.778168 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qhnln" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.779696 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.800674 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.806504 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.815672 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.823875 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.829050 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.836247 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.840224 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.894583 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.897398 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v85v\" (UniqueName: \"kubernetes.io/projected/3133ba3d-4738-4ae2-aa39-f651cd3d3bd1-kube-api-access-5v85v\") pod \"multus-admission-controller-857f4d67dd-9zbfw\" (UID: \"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.913495 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fskx7\" (UniqueName: \"kubernetes.io/projected/3d64b83b-4248-4a44-ada4-8fd3439d4d54-kube-api-access-fskx7\") pod \"catalog-operator-68c6474976-cqq89\" (UID: \"3d64b83b-4248-4a44-ada4-8fd3439d4d54\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.918408 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.939110 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9hxm\" (UniqueName: \"kubernetes.io/projected/c2de137a-5cfb-4e83-bc12-a51456830ecb-kube-api-access-g9hxm\") pod \"machine-config-operator-74547568cd-9cntp\" (UID: \"c2de137a-5cfb-4e83-bc12-a51456830ecb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.943000 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm"] Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.948799 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.959087 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpd5\" (UniqueName: \"kubernetes.io/projected/eb1aaaad-b069-4b70-87aa-51beaea830b3-kube-api-access-jjpd5\") pod \"machine-config-controller-84d6567774-8nv6b\" (UID: \"eb1aaaad-b069-4b70-87aa-51beaea830b3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.961423 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" event={"ID":"03c90169-1577-4751-bae4-ed1d7c19b416","Type":"ContainerStarted","Data":"1d9823d1ddcad23201546847ae6c726db3778429805c132b815e1bf00086cb88"} Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.964084 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" event={"ID":"1335413b-43df-4ec7-a45d-eb1094b8a125","Type":"ContainerStarted","Data":"0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c"} Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.964939 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.968862 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.982440 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46ttv\" (UniqueName: \"kubernetes.io/projected/27583c54-ec17-44a2-8240-224df02a4cbc-kube-api-access-46ttv\") pod \"collect-profiles-29329200-mltkw\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:55 crc kubenswrapper[4892]: I1006 12:10:55.992717 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:55.996477 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxljp\" (UniqueName: \"kubernetes.io/projected/dd66776b-8020-4b60-b147-fb605daea344-kube-api-access-dxljp\") pod \"dns-operator-744455d44c-f5gdg\" (UID: \"dd66776b-8020-4b60-b147-fb605daea344\") " pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.015416 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zdpfp"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.019063 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7cg\" (UniqueName: \"kubernetes.io/projected/4c9c567b-051e-4d81-9f50-f575b43b3a04-kube-api-access-6q7cg\") pod \"router-default-5444994796-psw22\" (UID: \"4c9c567b-051e-4d81-9f50-f575b43b3a04\") " pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.036045 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9kds\" (UniqueName: \"kubernetes.io/projected/aac1222e-f92a-4345-8ca2-125d2d2c2627-kube-api-access-k9kds\") pod \"marketplace-operator-79b997595-gj4rs\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.037550 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.055039 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7c2\" (UniqueName: \"kubernetes.io/projected/ab463181-efdc-4a78-b735-176516f4d185-kube-api-access-ws7c2\") pod \"control-plane-machine-set-operator-78cbb6b69f-4bs2h\" (UID: \"ab463181-efdc-4a78-b735-176516f4d185\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.077783 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/81b7b337-cd16-4231-bf36-f505d7d9afb5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-sl4cl\" (UID: \"81b7b337-cd16-4231-bf36-f505d7d9afb5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.094242 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.124995 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfvx8\" (UniqueName: \"kubernetes.io/projected/caa02b36-1e64-4845-ac7a-1d9ba48b1d18-kube-api-access-hfvx8\") pod \"migrator-59844c95c7-66c2c\" (UID: \"caa02b36-1e64-4845-ac7a-1d9ba48b1d18\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.151081 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.206382 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.207361 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208438 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208480 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-certificates\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208508 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqvm9\" (UniqueName: \"kubernetes.io/projected/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-kube-api-access-jqvm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208581 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp774\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-kube-api-access-dp774\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208627 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q255f\" (UniqueName: \"kubernetes.io/projected/d26efdd9-e946-418f-95a6-0100f0364b92-kube-api-access-q255f\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208653 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-service-ca\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208761 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.208916 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.209018 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d15ec4b-09ec-427a-b002-a7293f363d8a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.209057 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d15ec4b-09ec-427a-b002-a7293f363d8a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.209084 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpvgx\" (UniqueName: \"kubernetes.io/projected/ecf3538a-312a-4485-933b-021f39fb9281-kube-api-access-dpvgx\") pod \"package-server-manager-789f6589d5-xp47t\" (UID: \"ecf3538a-312a-4485-933b-021f39fb9281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.209112 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcrsd\" (UniqueName: \"kubernetes.io/projected/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-kube-api-access-jcrsd\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.209193 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wc95\" (UniqueName: \"kubernetes.io/projected/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-kube-api-access-8wc95\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.209269 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.209296 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-metrics-tls\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 
12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.212050 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:56.712025812 +0000 UTC m=+143.261731587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.214832 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220394 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8pkt\" (UniqueName: \"kubernetes.io/projected/8a138d52-46a4-411d-a02e-8813b29d0ff5-kube-api-access-l8pkt\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220447 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-signing-key\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220474 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-oauth-config\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220496 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcbe3ede-971b-4d1a-8a9c-fc9a6185e541-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jwj8k\" (UID: \"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rbb\" (UniqueName: \"kubernetes.io/projected/bcbe3ede-971b-4d1a-8a9c-fc9a6185e541-kube-api-access-24rbb\") pod \"cluster-samples-operator-665b6dd947-jwj8k\" (UID: \"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220586 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-trusted-ca\") pod \"image-registry-697d97f7c8-ncchp\" (UID: 
\"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220625 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-serving-cert\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220646 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf3538a-312a-4485-933b-021f39fb9281-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xp47t\" (UID: \"ecf3538a-312a-4485-933b-021f39fb9281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220678 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220782 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-bound-sa-token\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220845 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-console-config\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220867 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-trusted-ca\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220886 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a138d52-46a4-411d-a02e-8813b29d0ff5-srv-cert\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220905 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a138d52-46a4-411d-a02e-8813b29d0ff5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.220969 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-trusted-ca-bundle\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.221012 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-oauth-serving-cert\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.221028 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-tls\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.233885 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.238912 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.239205 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cxk5j"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.239316 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qhnln"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.239478 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-f59vt"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.244537 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-74v25"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.260486 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.277520 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:10:56 crc kubenswrapper[4892]: W1006 12:10:56.278661 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5af32f0a_19b6_4fe5_9507_d50dd25053a3.slice/crio-7db2e4fef6ab4be4bb8c9046106aa64cdea4cb657c3e9c571d5d0437210b1b40 WatchSource:0}: Error finding container 7db2e4fef6ab4be4bb8c9046106aa64cdea4cb657c3e9c571d5d0437210b1b40: Status 404 returned error can't find the container with id 7db2e4fef6ab4be4bb8c9046106aa64cdea4cb657c3e9c571d5d0437210b1b40 Oct 06 12:10:56 crc kubenswrapper[4892]: W1006 12:10:56.280772 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2b7f5f5_c72f_4ffc_b2ab_b89aa8a9bbd2.slice/crio-8c31ed7c114930dbef63276cddc8ee0488d00590b2a41bb4d6efb6b677c10432 WatchSource:0}: Error finding container 8c31ed7c114930dbef63276cddc8ee0488d00590b2a41bb4d6efb6b677c10432: Status 404 returned error can't find the container with id 8c31ed7c114930dbef63276cddc8ee0488d00590b2a41bb4d6efb6b677c10432 Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.299960 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.322242 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.322712 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:56.822690197 +0000 UTC m=+143.372395962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323043 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-bound-sa-token\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323079 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-console-config\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323100 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5237ad21-39ce-43b8-a1da-bb226f255b13-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323119 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a138d52-46a4-411d-a02e-8813b29d0ff5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323136 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb2f8\" (UniqueName: \"kubernetes.io/projected/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-kube-api-access-vb2f8\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323151 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjg27\" (UniqueName: \"kubernetes.io/projected/474567d5-fe59-4883-a548-77a0217b490d-kube-api-access-rjg27\") pod \"dns-default-hfjg7\" (UID: \"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-trusted-ca\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323195 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/8a138d52-46a4-411d-a02e-8813b29d0ff5-srv-cert\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323247 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-trusted-ca-bundle\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323264 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-oauth-serving-cert\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323282 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-mountpoint-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323296 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/474567d5-fe59-4883-a548-77a0217b490d-metrics-tls\") pod \"dns-default-hfjg7\" (UID: \"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323313 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-tls\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323344 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4lhl\" (UniqueName: \"kubernetes.io/projected/54de0a22-2d75-4482-a8f4-b71063b9e356-kube-api-access-b4lhl\") pod \"ingress-canary-ghtz9\" (UID: \"54de0a22-2d75-4482-a8f4-b71063b9e356\") " pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323372 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/e0484ee5-9669-4442-ad6b-90cc850c81ea-kube-api-access-t67jv\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323398 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323435 
4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-certificates\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323457 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqvm9\" (UniqueName: \"kubernetes.io/projected/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-kube-api-access-jqvm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323497 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp774\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-kube-api-access-dp774\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323525 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d484383e-4760-4087-847f-644b007f5656-config\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323541 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswlf\" (UniqueName: \"kubernetes.io/projected/d484383e-4760-4087-847f-644b007f5656-kube-api-access-jswlf\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323568 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-service-ca\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323603 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q255f\" (UniqueName: \"kubernetes.io/projected/d26efdd9-e946-418f-95a6-0100f0364b92-kube-api-access-q255f\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323620 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-registration-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323644 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/54de0a22-2d75-4482-a8f4-b71063b9e356-cert\") pod \"ingress-canary-ghtz9\" (UID: \"54de0a22-2d75-4482-a8f4-b71063b9e356\") " pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323694 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323731 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjl4\" (UniqueName: \"kubernetes.io/projected/ac237b76-87ec-461c-a07d-b9b979f96d75-kube-api-access-xcjl4\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323780 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-csi-data-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323806 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323848 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-certs\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323865 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5695d15c-0766-463e-8111-9bc66db72e77-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323969 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d15ec4b-09ec-427a-b002-a7293f363d8a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.323992 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d15ec4b-09ec-427a-b002-a7293f363d8a-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324008 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpvgx\" (UniqueName: \"kubernetes.io/projected/ecf3538a-312a-4485-933b-021f39fb9281-kube-api-access-dpvgx\") pod \"package-server-manager-789f6589d5-xp47t\" (UID: \"ecf3538a-312a-4485-933b-021f39fb9281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0484ee5-9669-4442-ad6b-90cc850c81ea-webhook-cert\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324059 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcrsd\" (UniqueName: \"kubernetes.io/projected/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-kube-api-access-jcrsd\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324085 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wc95\" (UniqueName: \"kubernetes.io/projected/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-kube-api-access-8wc95\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324103 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5695d15c-0766-463e-8111-9bc66db72e77-config\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324130 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324165 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-metrics-tls\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324181 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/474567d5-fe59-4883-a548-77a0217b490d-config-volume\") pod \"dns-default-hfjg7\" (UID: 
\"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324199 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8pkt\" (UniqueName: \"kubernetes.io/projected/8a138d52-46a4-411d-a02e-8813b29d0ff5-kube-api-access-l8pkt\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324214 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e0484ee5-9669-4442-ad6b-90cc850c81ea-tmpfs\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324238 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-signing-key\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324255 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-oauth-config\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-socket-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcbe3ede-971b-4d1a-8a9c-fc9a6185e541-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jwj8k\" (UID: \"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324309 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rbb\" (UniqueName: \"kubernetes.io/projected/bcbe3ede-971b-4d1a-8a9c-fc9a6185e541-kube-api-access-24rbb\") pod \"cluster-samples-operator-665b6dd947-jwj8k\" (UID: \"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324347 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0484ee5-9669-4442-ad6b-90cc850c81ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324366 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-trusted-ca\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324385 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5237ad21-39ce-43b8-a1da-bb226f255b13-config\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324401 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-plugins-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324439 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5237ad21-39ce-43b8-a1da-bb226f255b13-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324454 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-node-bootstrap-token\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324469 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-serving-cert\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324486 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf3538a-312a-4485-933b-021f39fb9281-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xp47t\" (UID: \"ecf3538a-312a-4485-933b-021f39fb9281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324526 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324542 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5695d15c-0766-463e-8111-9bc66db72e77-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.324572 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d484383e-4760-4087-847f-644b007f5656-serving-cert\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.325402 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-console-config\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.325418 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-trusted-ca-bundle\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.327235 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.328773 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:56.828756536 +0000 UTC m=+143.378462301 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.330492 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-service-ca\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.330931 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d15ec4b-09ec-427a-b002-a7293f363d8a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.332183 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-trusted-ca\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.333095 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-certificates\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.333272 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-signing-cabundle\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.334576 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-tls\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.334701 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-oauth-serving-cert\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.335082 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-trusted-ca\") pod \"image-registry-697d97f7c8-ncchp\" (UID: 
\"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.337682 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-signing-key\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.339461 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bcbe3ede-971b-4d1a-8a9c-fc9a6185e541-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jwj8k\" (UID: \"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.340769 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-serving-cert\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.342119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-oauth-config\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.344333 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a138d52-46a4-411d-a02e-8813b29d0ff5-srv-cert\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.347169 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d15ec4b-09ec-427a-b002-a7293f363d8a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.350257 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.350682 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-metrics-tls\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.354141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/8a138d52-46a4-411d-a02e-8813b29d0ff5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.354687 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf3538a-312a-4485-933b-021f39fb9281-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xp47t\" (UID: \"ecf3538a-312a-4485-933b-021f39fb9281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.371972 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-bound-sa-token\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.395382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wc95\" (UniqueName: \"kubernetes.io/projected/e7208801-52e6-44b2-8fe8-508fcdcfc9c0-kube-api-access-8wc95\") pod \"service-ca-9c57cc56f-kgnbr\" (UID: \"e7208801-52e6-44b2-8fe8-508fcdcfc9c0\") " pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.398382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.411343 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.412194 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcrsd\" (UniqueName: \"kubernetes.io/projected/2bbf72a0-ca1f-43e9-a0eb-56bed4086b97-kube-api-access-jcrsd\") pod \"ingress-operator-5b745b69d9-8q55t\" (UID: \"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.425710 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.425942 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-mountpoint-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.425966 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4lhl\" (UniqueName: 
\"kubernetes.io/projected/54de0a22-2d75-4482-a8f4-b71063b9e356-kube-api-access-b4lhl\") pod \"ingress-canary-ghtz9\" (UID: \"54de0a22-2d75-4482-a8f4-b71063b9e356\") " pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.426047 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:56.925963919 +0000 UTC m=+143.475669694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426132 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/474567d5-fe59-4883-a548-77a0217b490d-metrics-tls\") pod \"dns-default-hfjg7\" (UID: \"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426144 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-mountpoint-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426421 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/e0484ee5-9669-4442-ad6b-90cc850c81ea-kube-api-access-t67jv\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426528 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d484383e-4760-4087-847f-644b007f5656-config\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426580 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jswlf\" (UniqueName: \"kubernetes.io/projected/d484383e-4760-4087-847f-644b007f5656-kube-api-access-jswlf\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426616 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-registration-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426796 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54de0a22-2d75-4482-a8f4-b71063b9e356-cert\") pod \"ingress-canary-ghtz9\" (UID: \"54de0a22-2d75-4482-a8f4-b71063b9e356\") " pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426888 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.426957 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjl4\" (UniqueName: \"kubernetes.io/projected/ac237b76-87ec-461c-a07d-b9b979f96d75-kube-api-access-xcjl4\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427526 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-csi-data-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-certs\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427595 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5695d15c-0766-463e-8111-9bc66db72e77-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427639 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0484ee5-9669-4442-ad6b-90cc850c81ea-webhook-cert\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427662 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5695d15c-0766-463e-8111-9bc66db72e77-config\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427686 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/474567d5-fe59-4883-a548-77a0217b490d-config-volume\") pod \"dns-default-hfjg7\" (UID: 
\"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427707 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e0484ee5-9669-4442-ad6b-90cc850c81ea-tmpfs\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427733 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-socket-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427749 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0484ee5-9669-4442-ad6b-90cc850c81ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427775 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5237ad21-39ce-43b8-a1da-bb226f255b13-config\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427801 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-plugins-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427820 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5237ad21-39ce-43b8-a1da-bb226f255b13-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427834 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-node-bootstrap-token\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427853 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5695d15c-0766-463e-8111-9bc66db72e77-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427874 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d484383e-4760-4087-847f-644b007f5656-serving-cert\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427896 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5237ad21-39ce-43b8-a1da-bb226f255b13-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427912 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjg27\" (UniqueName: \"kubernetes.io/projected/474567d5-fe59-4883-a548-77a0217b490d-kube-api-access-rjg27\") pod \"dns-default-hfjg7\" (UID: \"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.427931 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb2f8\" (UniqueName: \"kubernetes.io/projected/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-kube-api-access-vb2f8\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.428504 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-csi-data-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.428792 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5695d15c-0766-463e-8111-9bc66db72e77-config\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.428934 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-registration-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.429104 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/474567d5-fe59-4883-a548-77a0217b490d-config-volume\") pod \"dns-default-hfjg7\" (UID: \"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.429299 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d484383e-4760-4087-847f-644b007f5656-config\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.429394 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-plugins-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.429784 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e0484ee5-9669-4442-ad6b-90cc850c81ea-tmpfs\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.429836 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac237b76-87ec-461c-a07d-b9b979f96d75-socket-dir\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.430693 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/54de0a22-2d75-4482-a8f4-b71063b9e356-cert\") pod \"ingress-canary-ghtz9\" (UID: \"54de0a22-2d75-4482-a8f4-b71063b9e356\") " pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.431151 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5237ad21-39ce-43b8-a1da-bb226f255b13-config\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.431199 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:56.931158643 +0000 UTC m=+143.480864488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.442250 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.442250 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.446427 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d484383e-4760-4087-847f-644b007f5656-serving-cert\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.447010 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-node-bootstrap-token\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.447755 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e0484ee5-9669-4442-ad6b-90cc850c81ea-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.449062 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rbb\" (UniqueName: \"kubernetes.io/projected/bcbe3ede-971b-4d1a-8a9c-fc9a6185e541-kube-api-access-24rbb\") pod \"cluster-samples-operator-665b6dd947-jwj8k\" (UID: \"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.449353 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5695d15c-0766-463e-8111-9bc66db72e77-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.449597 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/474567d5-fe59-4883-a548-77a0217b490d-metrics-tls\") pod \"dns-default-hfjg7\" (UID: \"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.449710 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e0484ee5-9669-4442-ad6b-90cc850c81ea-webhook-cert\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.450730 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-certs\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.464092 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd"]
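The util.go:30 "No sandbox for pod can be found. Need to start a new one" entries mark kubelet deciding, per pod, that the runtime holds no live sandbox and a fresh one must be created through the CRI; the crio-prefixed IDs in the nearby cgroup watch-event 404s are these sandboxes, which cAdvisor's cgroup watcher notices a moment before CRI-O can answer for them, a benign race during mass startup. A rough sketch of the call kubelet makes at this point, assuming the k8s.io/cri-api runtime/v1 bindings and CRI-O's conventional socket path (the pod name and namespace are from the entry above; the socket path and UID placeholder are assumptions of this sketch):

    package main

    import (
    	"context"
    	"log"

    	"google.golang.org/grpc"
    	"google.golang.org/grpc/credentials/insecure"
    	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
    	// CRI-O's conventional socket path on an OpenShift node; an assumption
    	// of this sketch, not something stated in the log.
    	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
    		grpc.WithTransportCredentials(insecure.NewCredentials()))
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer conn.Close()

    	client := runtimeapi.NewRuntimeServiceClient(conn)
    	resp, err := client.RunPodSandbox(context.Background(), &runtimeapi.RunPodSandboxRequest{
    		Config: &runtimeapi.PodSandboxConfig{
    			Metadata: &runtimeapi.PodSandboxMetadata{
    				Name:      "ingress-operator-5b745b69d9-8q55t",    // pod named in the entry above
    				Namespace: "openshift-ingress-operator",           // ditto
    				Uid:       "00000000-0000-0000-0000-000000000000", // placeholder; the UID is not in that entry
    			},
    		},
    	})
    	if err != nil {
    		log.Fatal(err)
    	}
    	// The returned ID is the crio-<id> string that the cgroup watch events
    	// and later PLEG ContainerStarted entries refer to.
    	log.Println("new sandbox:", resp.PodSandboxId)
    }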
Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.465651 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5237ad21-39ce-43b8-a1da-bb226f255b13-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.466315 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q255f\" (UniqueName: \"kubernetes.io/projected/d26efdd9-e946-418f-95a6-0100f0364b92-kube-api-access-q255f\") pod \"console-f9d7485db-g2wtm\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: W1006 12:10:56.488339 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6f15a1_6936_485d_9788_4ac1eb111cc2.slice/crio-932ba501e944e2fa569b53feff1cae9a09d75c4bac5cdd0a6b4a61ee86afbd19 WatchSource:0}: Error finding container 932ba501e944e2fa569b53feff1cae9a09d75c4bac5cdd0a6b4a61ee86afbd19: Status 404 returned error can't find the container with id 932ba501e944e2fa569b53feff1cae9a09d75c4bac5cdd0a6b4a61ee86afbd19 Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.509194 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.512425 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.522066 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.528516 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.529067 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.029053484 +0000 UTC m=+143.578759249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.535921 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqvm9\" (UniqueName: \"kubernetes.io/projected/e0abff23-83fc-40fc-ba54-55ac7cc6c5cf-kube-api-access-jqvm9\") pod \"kube-storage-version-migrator-operator-b67b599dd-7gwd5\" (UID: \"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.548015 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpvgx\" (UniqueName: \"kubernetes.io/projected/ecf3538a-312a-4485-933b-021f39fb9281-kube-api-access-dpvgx\") pod \"package-server-manager-789f6589d5-xp47t\" (UID: \"ecf3538a-312a-4485-933b-021f39fb9281\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.566474 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8pkt\" (UniqueName: \"kubernetes.io/projected/8a138d52-46a4-411d-a02e-8813b29d0ff5-kube-api-access-l8pkt\") pod \"olm-operator-6b444d44fb-k6xp7\" (UID: \"8a138d52-46a4-411d-a02e-8813b29d0ff5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.567452 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp774\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-kube-api-access-dp774\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.577700 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.583982 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4lhl\" (UniqueName: \"kubernetes.io/projected/54de0a22-2d75-4482-a8f4-b71063b9e356-kube-api-access-b4lhl\") pod \"ingress-canary-ghtz9\" (UID: \"54de0a22-2d75-4482-a8f4-b71063b9e356\") " pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.588646 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d86rc"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.594374 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb2f8\" (UniqueName: \"kubernetes.io/projected/60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d-kube-api-access-vb2f8\") pod \"machine-config-server-fwwjc\" (UID: \"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d\") " pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.610193 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.610495 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" Oct 06 12:10:56 crc kubenswrapper[4892]: W1006 12:10:56.624782 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07272b5f_3586_4998_8ae8_6b0365531863.slice/crio-f761014696fbd8e7193a8821c0a29134140998a366692868c627444dd176c38e WatchSource:0}: Error finding container f761014696fbd8e7193a8821c0a29134140998a366692868c627444dd176c38e: Status 404 returned error can't find the container with id f761014696fbd8e7193a8821c0a29134140998a366692868c627444dd176c38e Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.630368 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.630691 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.130679728 +0000 UTC m=+143.680385493 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
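A reading aid for the nestedpendingoperations.go:348 entries that repeat above and below: each one parks a failed volume operation until the quoted wall-clock deadline, computed as now plus durationBeforeRetry (a flat 500ms on every line here), and the m=+143.680385493 suffix is Go's monotonic clock, seconds since the kubelet process started. Subtracting the monotonic value from the wall time therefore recovers the process start, and doing it for any two entries gives the same instant, a quick consistency check that all these retries belong to a single kubelet run. A small self-contained check, using two deadline pairs copied verbatim from entries above:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Go's time.String() layout, which kubelet uses in these messages.
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    	// Two "No retries permitted until <wall> m=+<mono>" pairs copied from
    	// entries above; nothing here is invented except the comparison itself.
    	pairs := []struct {
    		wall string
    		mono time.Duration
    	}{
    		{"2025-10-06 12:10:56.931158643 +0000 UTC", 143480864488 * time.Nanosecond},
    		{"2025-10-06 12:10:57.130679728 +0000 UTC", 143680385493 * time.Nanosecond},
    	}
    	for _, p := range pairs {
    		w, err := time.Parse(layout, p.wall)
    		if err != nil {
    			panic(err)
    		}
    		// Wall-clock deadline minus monotonic offset = process start time.
    		fmt.Println("kubelet started at:", w.Add(-p.mono))
    	}
    	// Both lines print ~2025-10-06 12:08:33.4502941 +0000 UTC, so every
    	// 500ms requeue in this stream traces back to one kubelet incarnation.
    }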
Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.635614 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fwwjc" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.640413 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjl4\" (UniqueName: \"kubernetes.io/projected/ac237b76-87ec-461c-a07d-b9b979f96d75-kube-api-access-xcjl4\") pod \"csi-hostpathplugin-jdnln\" (UID: \"ac237b76-87ec-461c-a07d-b9b979f96d75\") " pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.651440 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswlf\" (UniqueName: \"kubernetes.io/projected/d484383e-4760-4087-847f-644b007f5656-kube-api-access-jswlf\") pod \"service-ca-operator-777779d784-gf8hm\" (UID: \"d484383e-4760-4087-847f-644b007f5656\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.673808 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/e0484ee5-9669-4442-ad6b-90cc850c81ea-kube-api-access-t67jv\") pod \"packageserver-d55dfcdfc-7jkw5\" (UID: \"e0484ee5-9669-4442-ad6b-90cc850c81ea\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.675358 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jdnln" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.675717 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5237ad21-39ce-43b8-a1da-bb226f255b13-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-46gnn\" (UID: \"5237ad21-39ce-43b8-a1da-bb226f255b13\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.684025 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ghtz9" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.692847 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5695d15c-0766-463e-8111-9bc66db72e77-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-z9b97\" (UID: \"5695d15c-0766-463e-8111-9bc66db72e77\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: W1006 12:10:56.697958 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f3f423a_0bf7_413b_b277_5d8c5dcec3d0.slice/crio-859655dcb6958122e98674cedc0e6bbd63da3441a9f08c84fe1ea9f0d6ac1ebd WatchSource:0}: Error finding container 859655dcb6958122e98674cedc0e6bbd63da3441a9f08c84fe1ea9f0d6ac1ebd: Status 404 returned error can't find the container with id 859655dcb6958122e98674cedc0e6bbd63da3441a9f08c84fe1ea9f0d6ac1ebd Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.733473 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjg27\" (UniqueName: \"kubernetes.io/projected/474567d5-fe59-4883-a548-77a0217b490d-kube-api-access-rjg27\") pod \"dns-default-hfjg7\" (UID: \"474567d5-fe59-4883-a548-77a0217b490d\") " pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.739828 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.740302 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.24028709 +0000 UTC m=+143.789992855 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.834842 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.841857 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.843241 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.343229679 +0000 UTC m=+143.892935444 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.854695 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.854807 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.892006 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" podStartSLOduration=121.891987191 podStartE2EDuration="2m1.891987191s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:56.88782552 +0000 UTC m=+143.437531285" watchObservedRunningTime="2025-10-06 12:10:56.891987191 +0000 UTC m=+143.441692956" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.918382 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.923501 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.931656 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.934986 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9q95"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.941846 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-knk7q"] Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.943279 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:56 crc kubenswrapper[4892]: E1006 12:10:56.943567 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.443550819 +0000 UTC m=+143.993256584 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.944395 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.948860 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hfjg7" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.983171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" event={"ID":"5af32f0a-19b6-4fe5-9507-d50dd25053a3","Type":"ContainerStarted","Data":"5b8ec5afb7aa043c7495910bd4b8ec3d93050b45a48fb3ea45b1933cb93cc905"} Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.983205 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" event={"ID":"5af32f0a-19b6-4fe5-9507-d50dd25053a3","Type":"ContainerStarted","Data":"7db2e4fef6ab4be4bb8c9046106aa64cdea4cb657c3e9c571d5d0437210b1b40"} Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.990280 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" event={"ID":"e5f891dd-b8d4-4ecc-940b-f41218193a8b","Type":"ContainerStarted","Data":"fdc6bc82ec3516fa2ef184a8e4c516d3af61cb5b78365fcc3abd97f139f7b8a4"} Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.991536 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b646909-419f-466d-84d8-0ccd08567b52" containerID="48de77118c19ebafcc869016df92d8c27ff43db8060af6872b57252156956013" exitCode=0 Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.991631 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" event={"ID":"9b646909-419f-466d-84d8-0ccd08567b52","Type":"ContainerDied","Data":"48de77118c19ebafcc869016df92d8c27ff43db8060af6872b57252156956013"} Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.991672 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" event={"ID":"9b646909-419f-466d-84d8-0ccd08567b52","Type":"ContainerStarted","Data":"d272c2496a5e418c2c9ced79d3de918c56d3e1e7a8a0f1831832cec74d37f8b0"} Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.992984 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zdpfp" event={"ID":"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb","Type":"ContainerStarted","Data":"6255c8e295528b8322012f345b0bf5f2ee20baf4ff56a02058a9f126b3217722"} Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.993003 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zdpfp" event={"ID":"4a468eca-e75e-4bcb-84f4-27a4cc01e4cb","Type":"ContainerStarted","Data":"aeb249a5a04db0dae69b57366c119637e4da6099361f7129c737bdb71ef84055"} Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.993567 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zdpfp" Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.995524 4892 patch_prober.go:28] interesting pod/console-operator-58897d9998-zdpfp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 06 12:10:56 crc kubenswrapper[4892]: I1006 12:10:56.995552 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zdpfp" podUID="4a468eca-e75e-4bcb-84f4-27a4cc01e4cb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/readyz\": dial tcp 10.217.0.24:8443: connect: connection refused"
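The readiness failure above is the normal bring-up shape rather than a fault: the ContainerStarted events for console-operator and the first probe land within the same second, and until the process binds 10.217.0.24:8443 each GET of /readyz fails at the TCP dial, which keeps the pod out of its service endpoints but triggers no restart. A stand-in for a single probe attempt, assuming plain net/http; the URL is from the log, while the timeout and certificate handling are assumptions of this sketch:

    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"net/http"
    	"time"
    )

    // probeOnce approximates one kubelet HTTPS readiness attempt: any dial
    // error or status outside 200-399 counts as a failure.
    func probeOnce(url string) error {
    	client := &http.Client{
    		Timeout: time.Second,
    		Transport: &http.Transport{
    			// Operators serve self-signed certs; skipping verification is an
    			// assumption of this sketch, not a claim about kubelet's prober.
    			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
    		},
    	}
    	resp, err := client.Get(url)
    	if err != nil {
    		return err // e.g. "connect: connection refused" until the port binds
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
    		return fmt.Errorf("unexpected status %d", resp.StatusCode)
    	}
    	return nil
    }

    func main() {
    	// URL taken from the log entry above.
    	if err := probeOnce("https://10.217.0.24:8443/readyz"); err != nil {
    		fmt.Println("Probe failed:", err)
    	}
    }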
Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.017301 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" event={"ID":"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7","Type":"ContainerStarted","Data":"2e79558a02b9de694c191493dd3fb32ea59df24efeb29487e1519786022c4eea"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.045991 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.046773 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.546759459 +0000 UTC m=+144.096465224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.065198 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" event={"ID":"3f6f15a1-6936-485d-9788-4ac1eb111cc2","Type":"ContainerStarted","Data":"932ba501e944e2fa569b53feff1cae9a09d75c4bac5cdd0a6b4a61ee86afbd19"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.092846 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" event={"ID":"f130811c-c622-4ee1-994b-cda735eaaf41","Type":"ContainerStarted","Data":"8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.092887 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" event={"ID":"f130811c-c622-4ee1-994b-cda735eaaf41","Type":"ContainerStarted","Data":"a12ce925db6906c0ebe7074a79fe12ab2084dcd091196c8156f7623832951523"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.093695 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.104849 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" event={"ID":"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0","Type":"ContainerStarted","Data":"859655dcb6958122e98674cedc0e6bbd63da3441a9f08c84fe1ea9f0d6ac1ebd"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.132753 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 
12:10:57.133239 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qhnln" event={"ID":"b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2","Type":"ContainerStarted","Data":"50366e44a74e9474bfe6dc533ae31e9452ae29e7ffb574feed84e07be9e67c27"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.133282 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qhnln" event={"ID":"b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2","Type":"ContainerStarted","Data":"8c31ed7c114930dbef63276cddc8ee0488d00590b2a41bb4d6efb6b677c10432"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.134482 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qhnln" Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.138858 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fwwjc" event={"ID":"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d","Type":"ContainerStarted","Data":"0e1961316f2a48098cefb69281b9759438cb787825095662e1a9d214fc26ec3b"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.145378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" event={"ID":"07272b5f-3586-4998-8ae8-6b0365531863","Type":"ContainerStarted","Data":"f761014696fbd8e7193a8821c0a29134140998a366692868c627444dd176c38e"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.146309 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-f5gdg"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.147927 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.147962 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl"] Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.148072 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.64805852 +0000 UTC m=+144.197764285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.148227 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.149909 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.649894155 +0000 UTC m=+144.199599920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.153424 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9zbfw"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.169406 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-psw22" event={"ID":"4c9c567b-051e-4d81-9f50-f575b43b3a04","Type":"ContainerStarted","Data":"2001b27e3a12550f4aab5bde36590a3edeaa1f128998952bacce182b56346717"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.194625 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.194717 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" event={"ID":"03c90169-1577-4751-bae4-ed1d7c19b416","Type":"ContainerStarted","Data":"81f0328018ac26dc35f05166c358e714d70bb4e85304e8f4230ebf50dbc0c774"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.202437 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" event={"ID":"3d64b83b-4248-4a44-ada4-8fd3439d4d54","Type":"ContainerStarted","Data":"132e89676d445aadede9409ba3076ec99b8c3fa50ea1756341ed7b5abccdc2eb"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.224687 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" event={"ID":"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4","Type":"ContainerStarted","Data":"8483f5558667c293ebb57fb311fa1400bfdba32c2be0c0ebd4d3fbbe31004e09"} Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.231713 4892 
patch_prober.go:28] interesting pod/downloads-7954f5f757-qhnln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.232342 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qhnln" podUID="b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.249435 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.251550 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.75153155 +0000 UTC m=+144.301237315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.285148 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b"] Oct 06 12:10:57 crc kubenswrapper[4892]: W1006 12:10:57.298958 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab463181_efdc_4a78_b735_176516f4d185.slice/crio-643527081a01f74b2ad2a32c600fc27edab5c7b94a3d770a9934926ee66d4a97 WatchSource:0}: Error finding container 643527081a01f74b2ad2a32c600fc27edab5c7b94a3d770a9934926ee66d4a97: Status 404 returned error can't find the container with id 643527081a01f74b2ad2a32c600fc27edab5c7b94a3d770a9934926ee66d4a97 Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.357029 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.360978 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.860963285 +0000 UTC m=+144.410669050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.361735 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4rs"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.427224 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.460547 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.462894 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:57.962878201 +0000 UTC m=+144.512583966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.555646 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g2wtm"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.563715 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.564460 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.064446713 +0000 UTC m=+144.614152468 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.664461 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.665131 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.165108858 +0000 UTC m=+144.714814623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.665237 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.665608 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.165597438 +0000 UTC m=+144.715303203 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: W1006 12:10:57.683872 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26efdd9_e946_418f_95a6_0100f0364b92.slice/crio-6a9cc930a2d45034665cfc9fab2c236e12427cad98218fbcd6a694cb2fbc93b9 WatchSource:0}: Error finding container 6a9cc930a2d45034665cfc9fab2c236e12427cad98218fbcd6a694cb2fbc93b9: Status 404 returned error can't find the container with id 6a9cc930a2d45034665cfc9fab2c236e12427cad98218fbcd6a694cb2fbc93b9 Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.765942 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.766453 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.266433499 +0000 UTC m=+144.816139274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.864475 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.869752 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.870067 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.370056146 +0000 UTC m=+144.919761901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.873622 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ghtz9"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.911427 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.924079 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.971148 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.971872 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.471850257 +0000 UTC m=+145.021556022 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.971944 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:57 crc kubenswrapper[4892]: E1006 12:10:57.981151 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.481132948 +0000 UTC m=+145.030838713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.984693 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kgnbr"] Oct 06 12:10:57 crc kubenswrapper[4892]: I1006 12:10:57.985029 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jdnln"] Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.017881 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97"] Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.026195 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7"] Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.026236 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t"] Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.026261 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t"] Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.084196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.085683 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.585662152 +0000 UTC m=+145.135367917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.094463 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hfjg7"]
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.097635 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5"]
Oct 06 12:10:58 crc kubenswrapper[4892]: W1006 12:10:58.132920 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf3538a_312a_4485_933b_021f39fb9281.slice/crio-08a902156221910c00fda2d0e6f873e5d5b0d25233d6e7537d10b0a3c8380ed8 WatchSource:0}: Error finding container 08a902156221910c00fda2d0e6f873e5d5b0d25233d6e7537d10b0a3c8380ed8: Status 404 returned error can't find the container with id 08a902156221910c00fda2d0e6f873e5d5b0d25233d6e7537d10b0a3c8380ed8
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.183968 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn"]
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.188412 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.188703 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.688690584 +0000 UTC m=+145.238396349 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.189931 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm"]
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.240481 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ghtz9" event={"ID":"54de0a22-2d75-4482-a8f4-b71063b9e356","Type":"ContainerStarted","Data":"899e34ecbf0850d10bd13fe4b2787ccb788cf0dcbcc443a8007e6b038bcb4a33"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.242551 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" event={"ID":"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf","Type":"ContainerStarted","Data":"4eb36b5b248821a804881b480cb5b4e1e5bfc607826f08a43f0114bb3c3d31bf"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.261874 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" event={"ID":"e7208801-52e6-44b2-8fe8-508fcdcfc9c0","Type":"ContainerStarted","Data":"0ab13135a826cf54ae719d4820d2d168fe322cfb2605e96d3c21bab985840975"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.267083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" event={"ID":"099dffdf-1bf9-451d-8248-d4104dcdf1b6","Type":"ContainerStarted","Data":"c5a8bdaa4617dd43069e9d6857f684d8c032ca689a79a3c63ebf29f4880b42e2"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.267119 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" event={"ID":"099dffdf-1bf9-451d-8248-d4104dcdf1b6","Type":"ContainerStarted","Data":"e654ed95f53e7fb8659a737f272026820a151be51e5e555ea3f270e94b662e4a"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.268151 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" event={"ID":"caa02b36-1e64-4845-ac7a-1d9ba48b1d18","Type":"ContainerStarted","Data":"ab3c772b3a0789f4823320568949b28a1bfe5f1976d9df243f58409f73293f82"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.269006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" event={"ID":"07272b5f-3586-4998-8ae8-6b0365531863","Type":"ContainerStarted","Data":"7802288ebf7df6ffbad2a4c1961c1ea58a64fb1112ae324ec85cd3f351075c7c"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.270126 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" event={"ID":"81b7b337-cd16-4231-bf36-f505d7d9afb5","Type":"ContainerStarted","Data":"5af245df62e33941930c31cbf123db4180796311f292e13db33801eecb005d7e"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.270801 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" event={"ID":"ecf3538a-312a-4485-933b-021f39fb9281","Type":"ContainerStarted","Data":"08a902156221910c00fda2d0e6f873e5d5b0d25233d6e7537d10b0a3c8380ed8"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.271610 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" event={"ID":"34dfebbb-7912-4c8b-ba61-8d7c6cd6edb7","Type":"ContainerStarted","Data":"30b558816fcfcccd930098e8bde54365e66fab2cbaac9475869da873c16020db"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.272939 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" podStartSLOduration=123.272929304 podStartE2EDuration="2m3.272929304s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.211097294 +0000 UTC m=+144.760803059" watchObservedRunningTime="2025-10-06 12:10:58.272929304 +0000 UTC m=+144.822635069"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.281113 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" event={"ID":"aac1222e-f92a-4345-8ca2-125d2d2c2627","Type":"ContainerStarted","Data":"3077d4497f24bb3b7f9ab8599ab5676a90ddd14142211a62d5e15a0f61fa361c"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.284219 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" event={"ID":"ab463181-efdc-4a78-b735-176516f4d185","Type":"ContainerStarted","Data":"e2ef519e40b207e6998ac9a83923c2469b1a9e440a8953f2f5ac157eeeace3cf"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.284258 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" event={"ID":"ab463181-efdc-4a78-b735-176516f4d185","Type":"ContainerStarted","Data":"643527081a01f74b2ad2a32c600fc27edab5c7b94a3d770a9934926ee66d4a97"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.286028 4892 generic.go:334] "Generic (PLEG): container finished" podID="5af32f0a-19b6-4fe5-9507-d50dd25053a3" containerID="5b8ec5afb7aa043c7495910bd4b8ec3d93050b45a48fb3ea45b1933cb93cc905" exitCode=0
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.286079 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" event={"ID":"5af32f0a-19b6-4fe5-9507-d50dd25053a3","Type":"ContainerDied","Data":"5b8ec5afb7aa043c7495910bd4b8ec3d93050b45a48fb3ea45b1933cb93cc905"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.290701 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.291434 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.791414873 +0000 UTC m=+145.341120628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.294956 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" event={"ID":"e0484ee5-9669-4442-ad6b-90cc850c81ea","Type":"ContainerStarted","Data":"25056754c286439f5c11bb2b90a24616f65619067ff48f820bf00d7b8331977b"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.295882 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" event={"ID":"d6cd9565-520a-47d6-bb93-7423147863ef","Type":"ContainerStarted","Data":"a949af274614a59b0c009b2e3c84e712848fd14a10d454b5ed09880b3b9b0713"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.295907 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" event={"ID":"d6cd9565-520a-47d6-bb93-7423147863ef","Type":"ContainerStarted","Data":"ae399727ed570f0fdaab4edc55dc2108195f40d7c47b28beaeb528cc68dc6547"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.296270 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.297428 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" event={"ID":"9b646909-419f-466d-84d8-0ccd08567b52","Type":"ContainerStarted","Data":"53510130dc43440b740f72fc23b10a8977e938a9d789a6cb493818b1d9781dab"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.303864 4892 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-h9q95 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body=
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.304224 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.304678 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" event={"ID":"27583c54-ec17-44a2-8240-224df02a4cbc","Type":"ContainerStarted","Data":"df7c3cdb8251a98ff34a4de6decf8fbe7f743e3388925f7c2f38736e85756ee8"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.304704 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" event={"ID":"27583c54-ec17-44a2-8240-224df02a4cbc","Type":"ContainerStarted","Data":"be34ce40aa3212494341a3ac8983a2089ad3681d081b23c5476911ea768e0b70"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.309999 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jdnln" event={"ID":"ac237b76-87ec-461c-a07d-b9b979f96d75","Type":"ContainerStarted","Data":"331f9b427c3847ec6273be391be9320218ce130025b6deafb7e20b70ae9234b2"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.316478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" event={"ID":"5237ad21-39ce-43b8-a1da-bb226f255b13","Type":"ContainerStarted","Data":"bef7dc2f0bcf9dbce0f554136f757cafc06c5190d9721b69b7f75840fa9f6e61"}
Oct 06 12:10:58 crc kubenswrapper[4892]: W1006 12:10:58.317698 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd484383e_4760_4087_847f_644b007f5656.slice/crio-952ddbc9dc050c13b4425db2ef58ae147bfeb8d1b4c43a45f632c3c97c719f15 WatchSource:0}: Error finding container 952ddbc9dc050c13b4425db2ef58ae147bfeb8d1b4c43a45f632c3c97c719f15: Status 404 returned error can't find the container with id 952ddbc9dc050c13b4425db2ef58ae147bfeb8d1b4c43a45f632c3c97c719f15
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.318740 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" event={"ID":"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1","Type":"ContainerStarted","Data":"bb1f7dc8d5f5e1aaccabb14d056ec882b6a24f9f7c65d65eafd1c01e8be7ab3c"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.324836 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qhnln" podStartSLOduration=123.324825036 podStartE2EDuration="2m3.324825036s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.323430408 +0000 UTC m=+144.873136173" watchObservedRunningTime="2025-10-06 12:10:58.324825036 +0000 UTC m=+144.874530801"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.334689 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fwwjc" event={"ID":"60e06cdf-2bfe-4ce6-82b9-49915b5f8b1d","Type":"ContainerStarted","Data":"4f5ab24eec87c44fcf4ca640e355c442718de89cafd703504574a22f2d7847fb"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.337958 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" event={"ID":"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97","Type":"ContainerStarted","Data":"0446217a34d2db5c50b3cc0086b49d89c77a2c532d4c79d39ff282e5417e0531"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.339953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" event={"ID":"8a138d52-46a4-411d-a02e-8813b29d0ff5","Type":"ContainerStarted","Data":"bf232613d7e10d3c344c217844d857188191514616ca9fe84bc2d1e51ca793ab"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.356382 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" event={"ID":"7f3f423a-0bf7-413b-b277-5d8c5dcec3d0","Type":"ContainerStarted","Data":"c89046c3936ecd37b12c1de4dd918035b846103da8e6be5fed3bd52955953507"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.365253 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" event={"ID":"3f6f15a1-6936-485d-9788-4ac1eb111cc2","Type":"ContainerStarted","Data":"8ba726d44deda9491de5a270071feffde2fdb2c55e23438d8ce27263b4b484ad"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.367363 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g2wtm" event={"ID":"d26efdd9-e946-418f-95a6-0100f0364b92","Type":"ContainerStarted","Data":"6a9cc930a2d45034665cfc9fab2c236e12427cad98218fbcd6a694cb2fbc93b9"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.376416 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" event={"ID":"5695d15c-0766-463e-8111-9bc66db72e77","Type":"ContainerStarted","Data":"5d9ac7dfd1a75df3d94735ed1265721ed07739514e023d1ed022b2f5e0a15ef3"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.380736 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-psw22" event={"ID":"4c9c567b-051e-4d81-9f50-f575b43b3a04","Type":"ContainerStarted","Data":"027dae41aa7d5a726bad2ae441efdd46d13a38d4983d7280f7244df12d81cb18"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.382888 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zdpfp" podStartSLOduration=123.38287658 podStartE2EDuration="2m3.38287658s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.3446533 +0000 UTC m=+144.894359065" watchObservedRunningTime="2025-10-06 12:10:58.38287658 +0000 UTC m=+144.932582345"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.392313 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.393471 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" event={"ID":"03c90169-1577-4751-bae4-ed1d7c19b416","Type":"ContainerStarted","Data":"5d8746539dcaf8131e9acf9aa1c97b6929164d4c174a53b3743d3b4bb07720a4"}
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.394351 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.894335831 +0000 UTC m=+145.444041596 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.406741 4892 generic.go:334] "Generic (PLEG): container finished" podID="e5f891dd-b8d4-4ecc-940b-f41218193a8b" containerID="329a6019fefa397345bbef6a7f03a87233da90ed4a3f5aa499f24f5f85bc60fd" exitCode=0
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.406802 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" event={"ID":"e5f891dd-b8d4-4ecc-940b-f41218193a8b","Type":"ContainerDied","Data":"329a6019fefa397345bbef6a7f03a87233da90ed4a3f5aa499f24f5f85bc60fd"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.424036 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" event={"ID":"3d64b83b-4248-4a44-ada4-8fd3439d4d54","Type":"ContainerStarted","Data":"fbf893598115db7e5b4acb5246b88ca85c640a35999fd0fbe624e94ec0f1ac2d"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.425207 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.428676 4892 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-cqq89 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.428708 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" podUID="3d64b83b-4248-4a44-ada4-8fd3439d4d54" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.433767 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hfjg7" event={"ID":"474567d5-fe59-4883-a548-77a0217b490d","Type":"ContainerStarted","Data":"b2fda84216d91abe6c9904d438f3b05f31bbd761ee56273f33b0f8d28c75bab1"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.435514 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-d86rc" podStartSLOduration=123.435505302 podStartE2EDuration="2m3.435505302s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.431058539 +0000 UTC m=+144.980764304" watchObservedRunningTime="2025-10-06 12:10:58.435505302 +0000 UTC m=+144.985211067"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.455338 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" event={"ID":"eb1aaaad-b069-4b70-87aa-51beaea830b3","Type":"ContainerStarted","Data":"a2f7bb8d68c31c2e912ed5289141f46efae95ff834abab6a2eeeb4f787326eae"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.458493 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" podStartSLOduration=123.458480496 podStartE2EDuration="2m3.458480496s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.458147422 +0000 UTC m=+145.007853187" watchObservedRunningTime="2025-10-06 12:10:58.458480496 +0000 UTC m=+145.008186261"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.461135 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" event={"ID":"dd66776b-8020-4b60-b147-fb605daea344","Type":"ContainerStarted","Data":"66250465cd38ffc538139946331c70003518f97fcf55a1b9e7e3107f2e7b7ba4"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.464613 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" event={"ID":"98c007a1-0ab1-45e2-8f5e-8f7bfdb9b5f4","Type":"ContainerStarted","Data":"860b54f4ff9f3e378c058271f2b263c3db49af5fc54b8b29ea742543f3655380"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.482861 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-qhnln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body=
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.482907 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qhnln" podUID="b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.484110 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" event={"ID":"c2de137a-5cfb-4e83-bc12-a51456830ecb","Type":"ContainerStarted","Data":"9d4648ae654e5cdd7526cf09bd2ae570bc8f5d5db431bae68bff2748a90b699a"}
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.489892 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zdpfp"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.498406 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.498548 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:58.998528691 +0000 UTC m=+145.548234456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.498818 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.501092 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.001076245 +0000 UTC m=+145.550782010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.530620 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-psw22" podStartSLOduration=123.530601888 podStartE2EDuration="2m3.530601888s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.530195031 +0000 UTC m=+145.079900796" watchObservedRunningTime="2025-10-06 12:10:58.530601888 +0000 UTC m=+145.080307653"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.532563 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" podStartSLOduration=123.532541688 podStartE2EDuration="2m3.532541688s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.495769557 +0000 UTC m=+145.045475322" watchObservedRunningTime="2025-10-06 12:10:58.532541688 +0000 UTC m=+145.082247453"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.573226 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-fwwjc" podStartSLOduration=5.573211578 podStartE2EDuration="5.573211578s" podCreationTimestamp="2025-10-06 12:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.571388923 +0000 UTC m=+145.121094688" watchObservedRunningTime="2025-10-06 12:10:58.573211578 +0000 UTC m=+145.122917343"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.599990 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.601310 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.101288501 +0000 UTC m=+145.650994266 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.613032 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" podStartSLOduration=124.613018063 podStartE2EDuration="2m4.613018063s" podCreationTimestamp="2025-10-06 12:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.611522152 +0000 UTC m=+145.161227917" watchObservedRunningTime="2025-10-06 12:10:58.613018063 +0000 UTC m=+145.162723828"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.704125 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.704445 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.204434328 +0000 UTC m=+145.754140093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.741502 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-nzqh4" podStartSLOduration=123.74148371 podStartE2EDuration="2m3.74148371s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.693081512 +0000 UTC m=+145.242787287" watchObservedRunningTime="2025-10-06 12:10:58.74148371 +0000 UTC m=+145.291189475"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.770704 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-74v25" podStartSLOduration=123.77068714 podStartE2EDuration="2m3.77068714s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.739941187 +0000 UTC m=+145.289646962" watchObservedRunningTime="2025-10-06 12:10:58.77068714 +0000 UTC m=+145.320392905"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.773778 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-pfsfk" podStartSLOduration=123.773772446 podStartE2EDuration="2m3.773772446s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.767925446 +0000 UTC m=+145.317631201" watchObservedRunningTime="2025-10-06 12:10:58.773772446 +0000 UTC m=+145.323478211"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.804787 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.804950 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.304929196 +0000 UTC m=+145.854634961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.805054 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.805383 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.305371354 +0000 UTC m=+145.855077119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.906658 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:58 crc kubenswrapper[4892]: E1006 12:10:58.906950 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.406935876 +0000 UTC m=+145.956641641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.932097 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" podStartSLOduration=123.932079749 podStartE2EDuration="2m3.932079749s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.886676024 +0000 UTC m=+145.436381789" watchObservedRunningTime="2025-10-06 12:10:58.932079749 +0000 UTC m=+145.481785514"
Oct 06 12:10:58 crc kubenswrapper[4892]: I1006 12:10:58.976022 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-w9qcd" podStartSLOduration=123.976000112 podStartE2EDuration="2m3.976000112s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.932696014 +0000 UTC m=+145.482401789" watchObservedRunningTime="2025-10-06 12:10:58.976000112 +0000 UTC m=+145.525705887"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.009009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.009390 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.509375173 +0000 UTC m=+146.059080948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.110550 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.110731 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.610704865 +0000 UTC m=+146.160410640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.111149 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.111506 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.611490007 +0000 UTC m=+146.161195772 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.213074 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.214112 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.714097952 +0000 UTC m=+146.263803707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.280048 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-psw22"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.282554 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.282622 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.316163 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.316607 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.816589492 +0000 UTC m=+146.366295257 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.380651 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.404927 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9f9mr" podStartSLOduration=125.404908259 podStartE2EDuration="2m5.404908259s" podCreationTimestamp="2025-10-06 12:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:58.971252627 +0000 UTC m=+145.520958392" watchObservedRunningTime="2025-10-06 12:10:59.404908259 +0000 UTC m=+145.954614024"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.417852 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.418038 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.918014788 +0000 UTC m=+146.467720553 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.418772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.419301 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:10:59.91928121 +0000 UTC m=+146.468986975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.490557 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" event={"ID":"5af32f0a-19b6-4fe5-9507-d50dd25053a3","Type":"ContainerStarted","Data":"52d1debaa6b0818bfbd0631cbd93e839ddb52e955afa2330fdcfa4b4624e1fe3"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.490640 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.495505 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" event={"ID":"8a138d52-46a4-411d-a02e-8813b29d0ff5","Type":"ContainerStarted","Data":"a58eefaf78855c7b65ee540a0d0b40772f9c4a874b47063f2c9b88f8b7ccda9d"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.496077 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.497607 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g2wtm" event={"ID":"d26efdd9-e946-418f-95a6-0100f0364b92","Type":"ContainerStarted","Data":"8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.497994 4892 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6xp7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.498035 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" podUID="8a138d52-46a4-411d-a02e-8813b29d0ff5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.499199 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" event={"ID":"d484383e-4760-4087-847f-644b007f5656","Type":"ContainerStarted","Data":"c1cc043f6859a9db638a94b87027892d096cd758f75aa8afb98bd55b043cfe9a"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.499262 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" event={"ID":"d484383e-4760-4087-847f-644b007f5656","Type":"ContainerStarted","Data":"952ddbc9dc050c13b4425db2ef58ae147bfeb8d1b4c43a45f632c3c97c719f15"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.508883 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" event={"ID":"5237ad21-39ce-43b8-a1da-bb226f255b13","Type":"ContainerStarted","Data":"78599e8c32c18114273b0b6f9cf8ea2e55d143831217b1c9567e9038c0fe9af4"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.513492 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" event={"ID":"e0abff23-83fc-40fc-ba54-55ac7cc6c5cf","Type":"ContainerStarted","Data":"ab0b22940031d5bf658aacb0062605d750f023a5410660de57ae951e385c9b1e"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.515065 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" event={"ID":"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541","Type":"ContainerStarted","Data":"a72b109fd786e3ddafb4c967c8734606d6c0fb66a367d16cddccba7b4069b055"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.515086 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" event={"ID":"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541","Type":"ContainerStarted","Data":"887152581e45fa4762b38ffb2d7d7aa2f39be58770e02b2a9e2b87206c84e6c1"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.519399 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.519588 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.019560839 +0000 UTC m=+146.569266604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.519711 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.520048 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.020034498 +0000 UTC m=+146.569740263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.521717 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" event={"ID":"c2de137a-5cfb-4e83-bc12-a51456830ecb","Type":"ContainerStarted","Data":"a82a25b48321e05321b15b98457a1c39095782b6b22243f37e168b208713abcb"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.521758 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" event={"ID":"c2de137a-5cfb-4e83-bc12-a51456830ecb","Type":"ContainerStarted","Data":"df1997660c0d6f332ff3656f86707ce9086883fa19775295456699a4ce4e9ac5"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.524958 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" event={"ID":"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97","Type":"ContainerStarted","Data":"d28db0321f107af9435028c1f2058f97d486cd8984558070e667481d94413332"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.524998 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" event={"ID":"2bbf72a0-ca1f-43e9-a0eb-56bed4086b97","Type":"ContainerStarted","Data":"47f541218756ad1431068d6688687eada84bff28d51db913f2a1838c7c7790fc"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.526604 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" event={"ID":"e0484ee5-9669-4442-ad6b-90cc850c81ea","Type":"ContainerStarted","Data":"77d2cced6ec325dfe74d089db0f8a7aa88aed1e5d734cfab93c1773866185991"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.527594 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.530463 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" event={"ID":"ecf3538a-312a-4485-933b-021f39fb9281","Type":"ContainerStarted","Data":"5dcd5ea45441ec3101abcad2bae00a10d2241e2db9d96070e41a38a4b3b3682a"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.530491 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" event={"ID":"ecf3538a-312a-4485-933b-021f39fb9281","Type":"ContainerStarted","Data":"a24acfea3bbf14456186ccb1bcb6f2f61ff84ada4a8c9344c1cf9858c71c44be"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.531889 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" event={"ID":"aac1222e-f92a-4345-8ca2-125d2d2c2627","Type":"ContainerStarted","Data":"6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.533032 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.536314 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" event={"ID":"81b7b337-cd16-4231-bf36-f505d7d9afb5","Type":"ContainerStarted","Data":"9a083fca8bfaac4cf67b2d42a1947504d5c67b49bd239afcdb5b1cfd6abb1eab"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.538984 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" event={"ID":"e5f891dd-b8d4-4ecc-940b-f41218193a8b","Type":"ContainerStarted","Data":"007db4ec9c8369a515c04863d5ccc96c0ce6510277f502e9248d7a710e1ca741"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.540393 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" event={"ID":"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1","Type":"ContainerStarted","Data":"bf8898d4b4dd78f31698b173bd5c4fea661e464a97dd407490e0d48e58ee6130"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.540424 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" event={"ID":"3133ba3d-4738-4ae2-aa39-f651cd3d3bd1","Type":"ContainerStarted","Data":"dea2e1731529fb20190a2fd8b4b4f05ad45e48b163ba61a0767c9c4164cbdd51"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.542075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" event={"ID":"e7208801-52e6-44b2-8fe8-508fcdcfc9c0","Type":"ContainerStarted","Data":"6d11544da74dc938ef11cff6d20655ce44e2b0e8f2f6f3166926bb8624cc58c2"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.543689 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" event={"ID":"dd66776b-8020-4b60-b147-fb605daea344","Type":"ContainerStarted","Data":"392f73a4abbaf16d7f08ab67b2dbe6e2d07fd4e0e22d4aae5fec5217568daf8e"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.543725 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" event={"ID":"dd66776b-8020-4b60-b147-fb605daea344","Type":"ContainerStarted","Data":"d2dc21e559c7c7a5a598bfdf38e5a134f43ea2cf0f8b57ec577a09483a814767"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.545297 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" podStartSLOduration=124.545283505 podStartE2EDuration="2m4.545283505s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.5285956 +0000 UTC m=+146.078301365" watchObservedRunningTime="2025-10-06 12:10:59.545283505 +0000 UTC m=+146.094989290"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.545458 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ghtz9" event={"ID":"54de0a22-2d75-4482-a8f4-b71063b9e356","Type":"ContainerStarted","Data":"13a5821799b3fee69a2043b035f1ba7050ef1281542ed056a1fbba9e569e4af6"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.545965 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7gwd5" podStartSLOduration=124.545959273 podStartE2EDuration="2m4.545959273s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.544736923 +0000 UTC m=+146.094442688" watchObservedRunningTime="2025-10-06 12:10:59.545959273 +0000 UTC m=+146.095665038"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.547197 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" event={"ID":"eb1aaaad-b069-4b70-87aa-51beaea830b3","Type":"ContainerStarted","Data":"7973064407a8d7e9e37a7d0093fca23251a7fb9a743d2c4fd9d640ae1e09d7e3"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.547224 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" event={"ID":"eb1aaaad-b069-4b70-87aa-51beaea830b3","Type":"ContainerStarted","Data":"dd61208b9ae055bade3b44819e6ec3714d98f2543cbb1176e5f9a75fd75523a5"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.548553 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" event={"ID":"5695d15c-0766-463e-8111-9bc66db72e77","Type":"ContainerStarted","Data":"8c6f4739db1660fa36d94a016cc03052c889330b84fbc135e8251bf1d4a615e7"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.550810 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" event={"ID":"099dffdf-1bf9-451d-8248-d4104dcdf1b6","Type":"ContainerStarted","Data":"00aab8ea4f3426886bea2f45f7464ebf4c28c14490e01d15bab8f56efc508fd3"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.568491 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" podStartSLOduration=124.568472158 podStartE2EDuration="2m4.568472158s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.566177614 +0000 UTC m=+146.115883379" watchObservedRunningTime="2025-10-06 12:10:59.568472158 +0000 UTC m=+146.118177923"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.572047 4892 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jkw5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body=
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.572053 4892 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gj4rs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body=
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.572103 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" podUID="e0484ee5-9669-4442-ad6b-90cc850c81ea" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.572142 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" podUID="aac1222e-f92a-4345-8ca2-125d2d2c2627" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.572627 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hfjg7" event={"ID":"474567d5-fe59-4883-a548-77a0217b490d","Type":"ContainerStarted","Data":"b906415b38d07dac43f67c2677f2503e8261da88a9c9b691cd7bad90b769c036"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.574908 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" event={"ID":"caa02b36-1e64-4845-ac7a-1d9ba48b1d18","Type":"ContainerStarted","Data":"256a50314221b7f9c13a02083d992de066fe196145bb1ce84b514d83477c53c7"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.574950 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" event={"ID":"caa02b36-1e64-4845-ac7a-1d9ba48b1d18","Type":"ContainerStarted","Data":"30198dc159f984cf97330d32d533c00bca80e79a064a6ccc22c95db9c5829fe5"}
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.577927 4892 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-h9q95 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body=
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.577974 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.578901 4892 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-cqq89 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.578959 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" podUID="3d64b83b-4248-4a44-ada4-8fd3439d4d54" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.592551 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" podStartSLOduration=124.592531076 podStartE2EDuration="2m4.592531076s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.580015472 +0000 UTC m=+146.129721237" watchObservedRunningTime="2025-10-06
12:10:59.592531076 +0000 UTC m=+146.142236841" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.592664 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gf8hm" podStartSLOduration=124.592659891 podStartE2EDuration="2m4.592659891s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.590309425 +0000 UTC m=+146.140015190" watchObservedRunningTime="2025-10-06 12:10:59.592659891 +0000 UTC m=+146.142365656" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.618824 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-g2wtm" podStartSLOduration=124.618805505 podStartE2EDuration="2m4.618805505s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.616775722 +0000 UTC m=+146.166481497" watchObservedRunningTime="2025-10-06 12:10:59.618805505 +0000 UTC m=+146.168511280" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.621796 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.621932 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.121916513 +0000 UTC m=+146.671622288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.622296 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.624562 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.124548341 +0000 UTC m=+146.674254106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.659166 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-46gnn" podStartSLOduration=124.659149652 podStartE2EDuration="2m4.659149652s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.635426118 +0000 UTC m=+146.185131883" watchObservedRunningTime="2025-10-06 12:10:59.659149652 +0000 UTC m=+146.208855417" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.660243 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8nv6b" podStartSLOduration=124.660236977 podStartE2EDuration="2m4.660236977s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.657603659 +0000 UTC m=+146.207309434" watchObservedRunningTime="2025-10-06 12:10:59.660236977 +0000 UTC m=+146.209942742" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.702065 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-z9b97" podStartSLOduration=124.702044914 podStartE2EDuration="2m4.702044914s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.689916306 +0000 UTC m=+146.239622071" watchObservedRunningTime="2025-10-06 12:10:59.702044914 +0000 UTC m=+146.251750679" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.702182 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ghtz9" podStartSLOduration=6.702171439 podStartE2EDuration="6.702171439s" podCreationTimestamp="2025-10-06 12:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.673096175 +0000 UTC m=+146.222801940" watchObservedRunningTime="2025-10-06 12:10:59.702171439 +0000 UTC m=+146.251877204" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.723372 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9zbfw" podStartSLOduration=124.723352099 podStartE2EDuration="2m4.723352099s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.719747981 +0000 UTC m=+146.269453746" watchObservedRunningTime="2025-10-06 12:10:59.723352099 +0000 UTC m=+146.273057864" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.724459 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.724785 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.224771128 +0000 UTC m=+146.774476893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.736022 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" podStartSLOduration=124.736008319 podStartE2EDuration="2m4.736008319s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.73553846 +0000 UTC m=+146.285244225" watchObservedRunningTime="2025-10-06 12:10:59.736008319 +0000 UTC m=+146.285714084" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.754796 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-f5gdg" podStartSLOduration=124.75478238 podStartE2EDuration="2m4.75478238s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.752915604 +0000 UTC m=+146.302621369" watchObservedRunningTime="2025-10-06 12:10:59.75478238 +0000 UTC m=+146.304488145" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.807055 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-knk7q" podStartSLOduration=124.807039007 podStartE2EDuration="2m4.807039007s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.775370806 +0000 UTC m=+146.325076571" watchObservedRunningTime="2025-10-06 12:10:59.807039007 +0000 UTC m=+146.356744772" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.808253 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-sl4cl" podStartSLOduration=124.808249647 podStartE2EDuration="2m4.808249647s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.807268606 +0000 UTC m=+146.356974371" watchObservedRunningTime="2025-10-06 12:10:59.808249647 +0000 UTC 
m=+146.357955412" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.829004 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.829292 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.329280621 +0000 UTC m=+146.878986386 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.849201 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4bs2h" podStartSLOduration=124.849188468 podStartE2EDuration="2m4.849188468s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.844439853 +0000 UTC m=+146.394145618" watchObservedRunningTime="2025-10-06 12:10:59.849188468 +0000 UTC m=+146.398894233" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.872373 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kgnbr" podStartSLOduration=124.87235597 podStartE2EDuration="2m4.87235597s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:10:59.870942362 +0000 UTC m=+146.420648127" watchObservedRunningTime="2025-10-06 12:10:59.87235597 +0000 UTC m=+146.422061735" Oct 06 12:10:59 crc kubenswrapper[4892]: I1006 12:10:59.930035 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:10:59 crc kubenswrapper[4892]: E1006 12:10:59.930308 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.43029471 +0000 UTC m=+146.980000475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.031355 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.031735 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.531722406 +0000 UTC m=+147.081428171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.132038 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.132488 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.632471824 +0000 UTC m=+147.182177589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.233653 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.234122 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.734102539 +0000 UTC m=+147.283808404 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.279918 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.279992 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.334590 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.334750 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.834730022 +0000 UTC m=+147.384435797 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.335201 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.335571 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.835559906 +0000 UTC m=+147.385265681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.436801 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.437015 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.936984782 +0000 UTC m=+147.486690557 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.437162 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.437492 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:00.937478513 +0000 UTC m=+147.487184278 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.538076 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.538211 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.038183919 +0000 UTC m=+147.587889694 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.538446 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.538768 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.038756873 +0000 UTC m=+147.588462708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.580037 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" event={"ID":"e5f891dd-b8d4-4ecc-940b-f41218193a8b","Type":"ContainerStarted","Data":"4f464d813e2e7af3f9ef7d945e7c948f409f20c425603880c248733d456ba1a1"} Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.581664 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hfjg7" event={"ID":"474567d5-fe59-4883-a548-77a0217b490d","Type":"ContainerStarted","Data":"c5ed6790bfea39909b88ed64595a9256004eb7202c2081468afaa74119413ca3"} Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.581712 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hfjg7" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.583310 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" event={"ID":"bcbe3ede-971b-4d1a-8a9c-fc9a6185e541","Type":"ContainerStarted","Data":"ff3e6232da7d2ab27d081d35fa4bc2141714fd3cd4a4174942a9b4bb6b56a57c"} Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.583960 4892 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-gj4rs container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.583974 4892 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-cqq89 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: 
connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.583998 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" podUID="aac1222e-f92a-4345-8ca2-125d2d2c2627" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.21:8080/healthz\": dial tcp 10.217.0.21:8080: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.583965 4892 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jkw5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.584076 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" podUID="e0484ee5-9669-4442-ad6b-90cc850c81ea" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.584018 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" podUID="3d64b83b-4248-4a44-ada4-8fd3439d4d54" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.584165 4892 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-h9q95 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.584195 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.584291 4892 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6xp7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.584335 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" podUID="8a138d52-46a4-411d-a02e-8813b29d0ff5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.613920 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" podStartSLOduration=125.613899799 podStartE2EDuration="2m5.613899799s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 
12:11:00.611275861 +0000 UTC m=+147.160981616" watchObservedRunningTime="2025-10-06 12:11:00.613899799 +0000 UTC m=+147.163605574" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.633114 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hfjg7" podStartSLOduration=7.633100528 podStartE2EDuration="7.633100528s" podCreationTimestamp="2025-10-06 12:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:00.631730302 +0000 UTC m=+147.181436087" watchObservedRunningTime="2025-10-06 12:11:00.633100528 +0000 UTC m=+147.182806293" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.639670 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.639882 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.139853455 +0000 UTC m=+147.689559230 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.640789 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.647112 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.147096663 +0000 UTC m=+147.696802418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.662158 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jwj8k" podStartSLOduration=125.662142731 podStartE2EDuration="2m5.662142731s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:00.65969554 +0000 UTC m=+147.209401305" watchObservedRunningTime="2025-10-06 12:11:00.662142731 +0000 UTC m=+147.211848496" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.664933 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.665244 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.666682 4892 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cxk5j container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.666731 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" podUID="e5f891dd-b8d4-4ecc-940b-f41218193a8b" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.687628 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8q55t" podStartSLOduration=125.687611277 podStartE2EDuration="2m5.687611277s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:00.68524929 +0000 UTC m=+147.234955055" watchObservedRunningTime="2025-10-06 12:11:00.687611277 +0000 UTC m=+147.237317042" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.706932 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-9cntp" podStartSLOduration=125.70691617 podStartE2EDuration="2m5.70691617s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:00.703361564 +0000 UTC m=+147.253067329" watchObservedRunningTime="2025-10-06 12:11:00.70691617 +0000 UTC m=+147.256621935" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.742201 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.742354 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.242333965 +0000 UTC m=+147.792039730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.742475 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.742767 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.242760212 +0000 UTC m=+147.792465977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.747380 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.748428 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.749879 4892 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-xfvct container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.749925 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" podUID="9b646909-419f-466d-84d8-0ccd08567b52" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.756591 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-66c2c" podStartSLOduration=125.75657343 podStartE2EDuration="2m5.75657343s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:00.754659411 +0000 UTC m=+147.304365166" watchObservedRunningTime="2025-10-06 12:11:00.75657343 +0000 UTC m=+147.306279195" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.757011 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" podStartSLOduration=125.757006237 podStartE2EDuration="2m5.757006237s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:00.740669316 +0000 UTC m=+147.290375081" watchObservedRunningTime="2025-10-06 12:11:00.757006237 +0000 UTC m=+147.306711992" Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.843402 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.843540 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-06 12:11:01.343516671 +0000 UTC m=+147.893222436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.844283 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.844556 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.344548323 +0000 UTC m=+147.894254088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.946072 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.946210 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.446190758 +0000 UTC m=+147.995896523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:00 crc kubenswrapper[4892]: I1006 12:11:00.946483 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:00 crc kubenswrapper[4892]: E1006 12:11:00.946799 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.446787353 +0000 UTC m=+147.996493118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.047295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.047564 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.54751891 +0000 UTC m=+148.097224675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.047743 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.048096 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.548085544 +0000 UTC m=+148.097791399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.149130 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.149310 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.64928654 +0000 UTC m=+148.198992305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.149683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.149989 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.649980899 +0000 UTC m=+148.199686664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.251046 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.251256 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.751227078 +0000 UTC m=+148.300932853 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.251407 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.251754 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.751745299 +0000 UTC m=+148.301451064 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.280996 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:01 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:01 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:01 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.281062 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.351915 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.352125 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.852095291 +0000 UTC m=+148.401801056 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.352279 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.352597 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.852584961 +0000 UTC m=+148.402290726 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.454288 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.454442 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.954419324 +0000 UTC m=+148.504125089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.454638 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.454925 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:01.954918244 +0000 UTC m=+148.504623999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.555675 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.555986 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.055971535 +0000 UTC m=+148.605677300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.589054 4892 generic.go:334] "Generic (PLEG): container finished" podID="27583c54-ec17-44a2-8240-224df02a4cbc" containerID="df7c3cdb8251a98ff34a4de6decf8fbe7f743e3388925f7c2f38736e85756ee8" exitCode=0 Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.589115 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" event={"ID":"27583c54-ec17-44a2-8240-224df02a4cbc","Type":"ContainerDied","Data":"df7c3cdb8251a98ff34a4de6decf8fbe7f743e3388925f7c2f38736e85756ee8"} Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.591063 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jdnln" event={"ID":"ac237b76-87ec-461c-a07d-b9b979f96d75","Type":"ContainerStarted","Data":"111cc2169b6b54f3851049be0f008afb69d154169d82ed6237272037a3e45153"} Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.593654 4892 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jkw5 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" start-of-body= Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.593695 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" podUID="e0484ee5-9669-4442-ad6b-90cc850c81ea" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.43:5443/healthz\": dial tcp 10.217.0.43:5443: connect: connection refused" Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.611757 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.656830 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.658293 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.158278828 +0000 UTC m=+148.707984583 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.757563 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.757761 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.257735753 +0000 UTC m=+148.807441518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.757883 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.758172 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.25816067 +0000 UTC m=+148.807866435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.858631 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.858826 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.358802024 +0000 UTC m=+148.908507789 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.858911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.859232 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.359220311 +0000 UTC m=+148.908926066 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.959828 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.960049 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.460016642 +0000 UTC m=+149.009722417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:01 crc kubenswrapper[4892]: I1006 12:11:01.960234 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:01 crc kubenswrapper[4892]: E1006 12:11:01.960569 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.460552844 +0000 UTC m=+149.010258609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.061502 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.061742 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.561710639 +0000 UTC m=+149.111416414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.061809 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.062170 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.562160677 +0000 UTC m=+149.111866542 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.162542 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.162677 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.662653085 +0000 UTC m=+149.212358850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.162954 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.162981 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.162997 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.163017 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.163405 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.663393416 +0000 UTC m=+149.213099181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.163714 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.164267 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.177282 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.177471 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.178050 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.264527 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.264876 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.764859383 +0000 UTC m=+149.314565148 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.282572 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 06 12:11:02 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld
Oct 06 12:11:02 crc kubenswrapper[4892]: [+]process-running ok
Oct 06 12:11:02 crc kubenswrapper[4892]: healthz check failed
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.282625 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.307919 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.366458 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.366905 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.866885894 +0000 UTC m=+149.416591749 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.391139 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.403069 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.469001 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.469163 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.969132644 +0000 UTC m=+149.518838409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.469572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.469892 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:02.969881885 +0000 UTC m=+149.519587660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.570837 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.571189 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.071175404 +0000 UTC m=+149.620881169 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.677577 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.679408 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.179391539 +0000 UTC m=+149.729097304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.779570 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.780076 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.280061065 +0000 UTC m=+149.829766820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:02 crc kubenswrapper[4892]: W1006 12:11:02.876616 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d3fb31c5b36fb66f1c9d1e4b38736711fd1d68f3829b8b7997b1f1a4e6b8482e WatchSource:0}: Error finding container d3fb31c5b36fb66f1c9d1e4b38736711fd1d68f3829b8b7997b1f1a4e6b8482e: Status 404 returned error can't find the container with id d3fb31c5b36fb66f1c9d1e4b38736711fd1d68f3829b8b7997b1f1a4e6b8482e Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.881566 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.381552143 +0000 UTC m=+149.931257908 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.881714 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.983309 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.983431 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.483408337 +0000 UTC m=+150.033114092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:02 crc kubenswrapper[4892]: I1006 12:11:02.983770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:11:02 crc kubenswrapper[4892]: E1006 12:11:02.984053 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.484039843 +0000 UTC m=+150.033745608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.020185 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.084104 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27583c54-ec17-44a2-8240-224df02a4cbc-config-volume\") pod \"27583c54-ec17-44a2-8240-224df02a4cbc\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") "
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.084222 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.084255 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46ttv\" (UniqueName: \"kubernetes.io/projected/27583c54-ec17-44a2-8240-224df02a4cbc-kube-api-access-46ttv\") pod \"27583c54-ec17-44a2-8240-224df02a4cbc\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") "
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.084359 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27583c54-ec17-44a2-8240-224df02a4cbc-secret-volume\") pod \"27583c54-ec17-44a2-8240-224df02a4cbc\" (UID: \"27583c54-ec17-44a2-8240-224df02a4cbc\") "
Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.085275 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.58525152 +0000 UTC m=+150.134957285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.085640 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27583c54-ec17-44a2-8240-224df02a4cbc-config-volume" (OuterVolumeSpecName: "config-volume") pod "27583c54-ec17-44a2-8240-224df02a4cbc" (UID: "27583c54-ec17-44a2-8240-224df02a4cbc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.088555 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27583c54-ec17-44a2-8240-224df02a4cbc-kube-api-access-46ttv" (OuterVolumeSpecName: "kube-api-access-46ttv") pod "27583c54-ec17-44a2-8240-224df02a4cbc" (UID: "27583c54-ec17-44a2-8240-224df02a4cbc"). InnerVolumeSpecName "kube-api-access-46ttv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.091420 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27583c54-ec17-44a2-8240-224df02a4cbc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27583c54-ec17-44a2-8240-224df02a4cbc" (UID: "27583c54-ec17-44a2-8240-224df02a4cbc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:11:03 crc kubenswrapper[4892]: W1006 12:11:03.135939 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-a18a240c25068f125bf22524c026bcc2491a495ff904bdc94385c1aed12eae10 WatchSource:0}: Error finding container a18a240c25068f125bf22524c026bcc2491a495ff904bdc94385c1aed12eae10: Status 404 returned error can't find the container with id a18a240c25068f125bf22524c026bcc2491a495ff904bdc94385c1aed12eae10
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.185741 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp"
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.185835 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27583c54-ec17-44a2-8240-224df02a4cbc-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.185846 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27583c54-ec17-44a2-8240-224df02a4cbc-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.185854 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46ttv\" (UniqueName: \"kubernetes.io/projected/27583c54-ec17-44a2-8240-224df02a4cbc-kube-api-access-46ttv\") on node \"crc\" DevicePath \"\""
Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.186080 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.686066001 +0000 UTC m=+150.235771766 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.281180 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:03 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:03 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:03 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.281384 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.286285 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.286490 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.786463375 +0000 UTC m=+150.336169140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.286543 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.286847 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.786836201 +0000 UTC m=+150.336541966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.358153 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2zd5m"] Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.358373 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27583c54-ec17-44a2-8240-224df02a4cbc" containerName="collect-profiles" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.358385 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="27583c54-ec17-44a2-8240-224df02a4cbc" containerName="collect-profiles" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.358468 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="27583c54-ec17-44a2-8240-224df02a4cbc" containerName="collect-profiles" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.359128 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.361158 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.370771 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2zd5m"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.387173 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.387368 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.887340309 +0000 UTC m=+150.437046074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.387656 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.387932 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.887926023 +0000 UTC m=+150.437631788 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.488825 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.489016 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.988995554 +0000 UTC m=+150.538701319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.489122 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsktw\" (UniqueName: \"kubernetes.io/projected/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-kube-api-access-lsktw\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.489155 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-catalog-content\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.489188 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-utilities\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.489378 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.489706 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:03.989694253 +0000 UTC m=+150.539400018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.539606 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.540219 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.544978 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.546802 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.566283 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-44kmf"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.567168 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.568839 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.583028 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44kmf"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.589815 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590095 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.590248 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.090225813 +0000 UTC m=+150.639931578 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590275 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsktw\" (UniqueName: \"kubernetes.io/projected/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-kube-api-access-lsktw\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-catalog-content\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590323 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590354 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-utilities\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590395 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590427 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.590820 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-catalog-content\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.590883 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-10-06 12:11:04.090875569 +0000 UTC m=+150.640581334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.591275 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-utilities\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.613092 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4dcff58025fcde33ffefde57c908de8d10c62e4eaafd5285b3e18648ee2d7309"} Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.613136 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a18a240c25068f125bf22524c026bcc2491a495ff904bdc94385c1aed12eae10"} Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.618049 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsktw\" (UniqueName: \"kubernetes.io/projected/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-kube-api-access-lsktw\") pod \"community-operators-2zd5m\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.619564 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"96d2196f904790a30daee4d5d082c5a32f44d878e5781bd9e8ec4b57083a542b"} Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.619718 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8d47387e6feb2fedfac7f7c435f1f5cf6863d0d48edbc5902ee39e09aef01ebb"} Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.623383 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" event={"ID":"27583c54-ec17-44a2-8240-224df02a4cbc","Type":"ContainerDied","Data":"be34ce40aa3212494341a3ac8983a2089ad3681d081b23c5476911ea768e0b70"} Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.623421 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be34ce40aa3212494341a3ac8983a2089ad3681d081b23c5476911ea768e0b70" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.623589 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.635154 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6451385d1b783acdd49d1236585ca827f637c9fb8d0c8025a4cca98775e71ee9"} Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.635209 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d3fb31c5b36fb66f1c9d1e4b38736711fd1d68f3829b8b7997b1f1a4e6b8482e"} Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.635435 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.671288 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.694678 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.695090 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.695132 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-utilities\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.695208 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.695231 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jppnl\" (UniqueName: \"kubernetes.io/projected/84310366-fec4-4521-a296-7fdba4b65821-kube-api-access-jppnl\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.695255 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-catalog-content\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " 
pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.695382 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.195368511 +0000 UTC m=+150.745074276 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.695421 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.722894 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.765664 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mtgsp"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.766568 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.789674 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtgsp"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.796628 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.796683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jppnl\" (UniqueName: \"kubernetes.io/projected/84310366-fec4-4521-a296-7fdba4b65821-kube-api-access-jppnl\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.796729 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-catalog-content\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.796781 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-utilities\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.797132 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-utilities\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.797344 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-catalog-content\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.797474 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.297460295 +0000 UTC m=+150.847166060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.818953 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jppnl\" (UniqueName: \"kubernetes.io/projected/84310366-fec4-4521-a296-7fdba4b65821-kube-api-access-jppnl\") pod \"certified-operators-44kmf\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.854817 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.897022 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.899718 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.899908 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-utilities\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.899993 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jfd\" (UniqueName: \"kubernetes.io/projected/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-kube-api-access-m2jfd\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.900024 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-catalog-content\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:03 crc kubenswrapper[4892]: E1006 12:11:03.900135 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.400120562 +0000 UTC m=+150.949826327 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.914857 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2zd5m"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.950697 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-swbp7"] Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.952044 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:03 crc kubenswrapper[4892]: I1006 12:11:03.969163 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swbp7"] Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.001806 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smnz9\" (UniqueName: \"kubernetes.io/projected/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-kube-api-access-smnz9\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.001866 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jfd\" (UniqueName: \"kubernetes.io/projected/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-kube-api-access-m2jfd\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.001884 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-utilities\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.001909 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-catalog-content\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.001930 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-catalog-content\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.001953 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-utilities\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " 
pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.001998 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.002686 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.502661454 +0000 UTC m=+151.052367219 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.002705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-catalog-content\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.002903 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-utilities\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.032228 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jfd\" (UniqueName: \"kubernetes.io/projected/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-kube-api-access-m2jfd\") pod \"community-operators-mtgsp\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.083406 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.103048 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.105625 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smnz9\" (UniqueName: \"kubernetes.io/projected/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-kube-api-access-smnz9\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.105670 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-utilities\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.105711 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-catalog-content\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.106154 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-catalog-content\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.106229 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.606211977 +0000 UTC m=+151.155917742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.107207 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-utilities\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.110101 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.126341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smnz9\" (UniqueName: \"kubernetes.io/projected/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-kube-api-access-smnz9\") pod \"certified-operators-swbp7\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.207202 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.207783 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.707772589 +0000 UTC m=+151.257478354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.281510 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:04 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:04 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:04 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.281557 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.308095 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.311265 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.811242909 +0000 UTC m=+151.360948674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.327583 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.395208 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44kmf"] Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.399600 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mtgsp"] Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.409340 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.409647 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:04.90963556 +0000 UTC m=+151.459341325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: W1006 12:11:04.429946 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfaba287_8034_4e1b_a4cd_f6c7962f9d45.slice/crio-0351ed4c868e9fe4e97cffe0a525f2c737da2424fcbcdde3337d214d734a6a63 WatchSource:0}: Error finding container 0351ed4c868e9fe4e97cffe0a525f2c737da2424fcbcdde3337d214d734a6a63: Status 404 returned error can't find the container with id 0351ed4c868e9fe4e97cffe0a525f2c737da2424fcbcdde3337d214d734a6a63 Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.510275 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.510475 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.010446271 +0000 UTC m=+151.560152026 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.510514 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.510853 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.010845997 +0000 UTC m=+151.560551752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.560694 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swbp7"] Oct 06 12:11:04 crc kubenswrapper[4892]: W1006 12:11:04.580407 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e19c6b4_23cb_4864_9470_ef8acaa1f5fc.slice/crio-7a988d017863fbf170c9b2e64a45fcc6e688c1fced20b9132f86c63120fba0f5 WatchSource:0}: Error finding container 7a988d017863fbf170c9b2e64a45fcc6e688c1fced20b9132f86c63120fba0f5: Status 404 returned error can't find the container with id 7a988d017863fbf170c9b2e64a45fcc6e688c1fced20b9132f86c63120fba0f5 Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.611766 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.611892 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.111875797 +0000 UTC m=+151.661581562 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.611965 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.612229 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.112221521 +0000 UTC m=+151.661927286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.640829 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtgsp" event={"ID":"bfaba287-8034-4e1b-a4cd-f6c7962f9d45","Type":"ContainerStarted","Data":"0351ed4c868e9fe4e97cffe0a525f2c737da2424fcbcdde3337d214d734a6a63"} Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.642182 4892 generic.go:334] "Generic (PLEG): container finished" podID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerID="ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b" exitCode=0 Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.642273 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zd5m" event={"ID":"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac","Type":"ContainerDied","Data":"ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b"} Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.642460 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zd5m" event={"ID":"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac","Type":"ContainerStarted","Data":"cec97df218026ff0993cd7038867162ee0f9559cad04e0ca336961f84baf0511"} Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.644945 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.648809 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5ac2118d-96d5-49c6-b76b-7fde57eaa826","Type":"ContainerStarted","Data":"db20a392188418e577f82cf22e68fe900e87c9dc6090f3d9de1066b8d7b0a5af"} Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.648845 4892 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5ac2118d-96d5-49c6-b76b-7fde57eaa826","Type":"ContainerStarted","Data":"32b980d5800e730e42de8a8d9d4acb26cddb477db7a3d2f335d22bde4ba20e8d"} Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.649988 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swbp7" event={"ID":"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc","Type":"ContainerStarted","Data":"7a988d017863fbf170c9b2e64a45fcc6e688c1fced20b9132f86c63120fba0f5"} Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.650880 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44kmf" event={"ID":"84310366-fec4-4521-a296-7fdba4b65821","Type":"ContainerStarted","Data":"a331f0fd0480227b56592efd79d6fbf07421d926d8cedb9fd0fa7f67872ce43e"} Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.713140 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.714283 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.214250052 +0000 UTC m=+151.763955817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.779179 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-f59vt" Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.814958 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.815237 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.31522369 +0000 UTC m=+151.864929455 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.915804 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.915966 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.415945107 +0000 UTC m=+151.965650872 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:04 crc kubenswrapper[4892]: I1006 12:11:04.916164 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:04 crc kubenswrapper[4892]: E1006 12:11:04.916489 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.416478669 +0000 UTC m=+151.966184434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.221435 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.221988 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.721957417 +0000 UTC m=+152.271663192 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.280500 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:05 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:05 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:05 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.280572 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.322925 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.323397 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:05.823371122 +0000 UTC m=+152.373076937 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.526799 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.527075 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.027044229 +0000 UTC m=+152.576749994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.527188 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.527690 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.027670264 +0000 UTC m=+152.577376039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.558064 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cf27c"] Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.560597 4892 util.go:30] "No sandbox for pod can be found. 
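Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf27c"

The repeating UnmountVolume/MountVolume failure cycle above is the kubelet's per-volume operation gating at work: each failed attempt against the not-yet-registered kubevirt.io.hostpath-provisioner driver records an earliest-retry deadline 500 ms out (the "No retries permitted until ... (durationBeforeRetry 500ms)" messages from nestedpendingoperations.go:348), while the volume reconciler keeps re-logging the started/failed pair on its roughly 100 ms passes. Below is a minimal, self-contained Go sketch of that gate; the opGate type and its fields are illustrative stand-ins, not the kubelet's actual nestedpendingoperations implementation:

    package main

    import (
    	"errors"
    	"fmt"
    	"time"
    )

    // opGate mimics the "no retries permitted until <deadline>" bookkeeping
    // the kubelet keeps per volume operation (illustrative, not the real type).
    type opGate struct {
    	notBefore map[string]time.Time // volume name -> earliest next attempt
    	backoff   time.Duration        // fixed 500ms, as seen in the log
    }

    // tryOperation runs op unless the volume is still inside its backoff
    // window; on failure it pushes the deadline another 500ms out.
    func (g *opGate) tryOperation(vol string, op func() error) {
    	if until, ok := g.notBefore[vol]; ok && time.Now().Before(until) {
    		return // still gated; a later reconciler pass will retry
    	}
    	if err := op(); err != nil {
    		g.notBefore[vol] = time.Now().Add(g.backoff)
    		fmt.Printf("failed. No retries permitted until %s: %v\n",
    			g.notBefore[vol].Format(time.RFC3339Nano), err)
    	}
    }

    func main() {
    	gate := &opGate{notBefore: map[string]time.Time{}, backoff: 500 * time.Millisecond}
    	registered := false // flips to true once the CSI driver registers

    	// Simulate the ~100ms reconciler passes seen in the log.
    	for i := 0; i < 12; i++ {
    		gate.tryOperation("pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8", func() error {
    			if !registered {
    				return errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
    			}
    			fmt.Println("MountDevice succeeded")
    			return nil
    		})
    		if i == 8 {
    			registered = true // the plugin socket appears and the driver registers
    		}
    		time.Sleep(100 * time.Millisecond)
    	}
    }

The loop resolves exactly the way the log does: the same error repeats until the driver registers, after which the identical operation succeeds on the next ungated pass.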
Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.563434 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.572479 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf27c"] Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.627946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.628110 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.128089109 +0000 UTC m=+152.677794874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.628215 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.628280 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwsl\" (UniqueName: \"kubernetes.io/projected/8e102df5-64d4-4682-9cb1-22a7165f4294-kube-api-access-8nwsl\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.628393 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-utilities\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.628440 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-catalog-content\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.628555 4892 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.128547798 +0000 UTC m=+152.678253563 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.656672 4892 generic.go:334] "Generic (PLEG): container finished" podID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerID="cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b" exitCode=0 Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.656752 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtgsp" event={"ID":"bfaba287-8034-4e1b-a4cd-f6c7962f9d45","Type":"ContainerDied","Data":"cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b"} Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.658290 4892 generic.go:334] "Generic (PLEG): container finished" podID="5ac2118d-96d5-49c6-b76b-7fde57eaa826" containerID="db20a392188418e577f82cf22e68fe900e87c9dc6090f3d9de1066b8d7b0a5af" exitCode=0 Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.658381 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5ac2118d-96d5-49c6-b76b-7fde57eaa826","Type":"ContainerDied","Data":"db20a392188418e577f82cf22e68fe900e87c9dc6090f3d9de1066b8d7b0a5af"} Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.660155 4892 generic.go:334] "Generic (PLEG): container finished" podID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerID="8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050" exitCode=0 Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.660190 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swbp7" event={"ID":"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc","Type":"ContainerDied","Data":"8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050"} Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.661602 4892 generic.go:334] "Generic (PLEG): container finished" podID="84310366-fec4-4521-a296-7fdba4b65821" containerID="b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108" exitCode=0 Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.661629 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44kmf" event={"ID":"84310366-fec4-4521-a296-7fdba4b65821","Type":"ContainerDied","Data":"b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108"} Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.729155 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.729511 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-utilities\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.729553 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-catalog-content\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.729665 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwsl\" (UniqueName: \"kubernetes.io/projected/8e102df5-64d4-4682-9cb1-22a7165f4294-kube-api-access-8nwsl\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.730167 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.230141011 +0000 UTC m=+152.779846766 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.730626 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-catalog-content\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.730844 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-utilities\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.755117 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.760757 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwsl\" (UniqueName: \"kubernetes.io/projected/8e102df5-64d4-4682-9cb1-22a7165f4294-kube-api-access-8nwsl\") pod \"redhat-marketplace-cf27c\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.763637 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-xfvct" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.779740 4892 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-qhnln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.779792 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qhnln" podUID="b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.779810 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-qhnln container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.779863 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qhnln" podUID="b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.838671 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.840061 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.340046065 +0000 UTC m=+152.889751820 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.879387 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.940752 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.940915 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.440891137 +0000 UTC m=+152.990596902 (durationBeforeRetry 500ms). 
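Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers

The downloads-7954f5f757-qhnln entries above show readiness and liveness probes failing with "connection refused": the kubelet is issuing plain HTTP GETs against 10.217.0.38:8080 before the download-server container is listening. The check is essentially the sketch below, a simplified stand-in for the kubelet's prober package (probeHTTP is an illustrative name): any status in the 2xx/3xx range passes, while dial errors and 5xx responses fail, which is also why the router's healthz endpoint returning 500 registers as a startup-probe failure.

    package main

    import (
    	"fmt"
    	"net/http"
    	"time"
    )

    // probeHTTP performs one HTTP-GET style probe: success is any status in
    // [200, 400); connection errors (like the "connection refused" above) and
    // 5xx responses both count as failures.
    func probeHTTP(url string, timeout time.Duration) error {
    	client := &http.Client{Timeout: timeout}
    	resp, err := client.Get(url)
    	if err != nil {
    		return fmt.Errorf("probe failed: %w", err)
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
    		return nil
    	}
    	return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
    }

    func main() {
    	// The endpoint from the log; it refuses connections until the
    	// download-server container actually listens on 8080.
    	if err := probeHTTP("http://10.217.0.38:8080/", time.Second); err != nil {
    		fmt.Println(err)
    	} else {
    		fmt.Println("probe ok")
    	}
    }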
Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.941504 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:05 crc kubenswrapper[4892]: E1006 12:11:05.942078 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.442063416 +0000 UTC m=+152.991769181 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.948884 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cdlc5"] Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.949774 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.954418 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-cqq89" Oct 06 12:11:05 crc kubenswrapper[4892]: I1006 12:11:05.962481 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdlc5"] Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.043743 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:06 crc kubenswrapper[4892]: E1006 12:11:06.043971 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.54394616 +0000 UTC m=+153.093651925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.044148 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-catalog-content\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.044210 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2r6g\" (UniqueName: \"kubernetes.io/projected/93d5a0ee-784b-45f0-bdad-5df1f824f031-kube-api-access-k2r6g\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.044258 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.044295 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-utilities\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: E1006 12:11:06.045225 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.545213433 +0000 UTC m=+153.094919198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.045606 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.145802 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.146143 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-utilities\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.146278 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-catalog-content\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.146393 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2r6g\" (UniqueName: \"kubernetes.io/projected/93d5a0ee-784b-45f0-bdad-5df1f824f031-kube-api-access-k2r6g\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: E1006 12:11:06.147212 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.647197821 +0000 UTC m=+153.196903586 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.148299 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-utilities\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.150471 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-catalog-content\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.183230 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2r6g\" (UniqueName: \"kubernetes.io/projected/93d5a0ee-784b-45f0-bdad-5df1f824f031-kube-api-access-k2r6g\") pod \"redhat-marketplace-cdlc5\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.190184 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf27c"] Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.247995 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:06 crc kubenswrapper[4892]: E1006 12:11:06.248343 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-06 12:11:06.748318864 +0000 UTC m=+153.298024629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncchp" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.271290 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.291440 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.297309 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:06 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:06 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:06 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.297380 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.298733 4892 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.341221 4892 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-06T12:11:06.298756276Z","Handler":null,"Name":""} Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.345727 4892 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.345763 4892 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.350863 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.355458 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.385136 4892 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cxk5j container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]log ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]etcd ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/generic-apiserver-start-informers ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/max-in-flight-filter ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 06 12:11:06 crc kubenswrapper[4892]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 06 12:11:06 crc kubenswrapper[4892]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/project.openshift.io-projectcache ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/openshift.io-startinformers ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 06 12:11:06 crc kubenswrapper[4892]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 06 12:11:06 crc kubenswrapper[4892]: livez check failed Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.385186 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" podUID="e5f891dd-b8d4-4ecc-940b-f41218193a8b" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.451916 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.454845 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
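This is the turning point of the whole mount saga: at 12:11:06.298 the kubelet's plugin watcher sees the driver's registration socket appear under /var/lib/kubelet/plugins_registry/, validates it, and registers kubevirt.io.hostpath-provisioner at the endpoint /var/lib/kubelet/plugins/csi-hostpath/csi.sock. Every earlier "driver name ... not found in the list of registered CSI drivers" error was a failed lookup against that in-memory table of registered drivers. A hedged Go sketch of the lookup-then-register flow; the map-based csiRegistry is illustrative, not the kubelet's real csi_plugin code:

    package main

    import (
    	"fmt"
    	"sync"
    )

    // csiRegistry stands in for the kubelet's table of registered CSI
    // drivers, keyed by driver name and holding each driver's endpoint.
    type csiRegistry struct {
    	mu      sync.RWMutex
    	drivers map[string]string // driver name -> unix socket endpoint
    }

    // newClient fails exactly the way the log does until the driver registers.
    func (r *csiRegistry) newClient(driver string) (string, error) {
    	r.mu.RLock()
    	defer r.mu.RUnlock()
    	endpoint, ok := r.drivers[driver]
    	if !ok {
    		return "", fmt.Errorf("driver name %s not found in the list of registered CSI drivers", driver)
    	}
    	return endpoint, nil
    }

    // register is what the plugin watcher triggers once it sees the
    // -reg.sock file appear under /var/lib/kubelet/plugins_registry/.
    func (r *csiRegistry) register(driver, endpoint string) {
    	r.mu.Lock()
    	defer r.mu.Unlock()
    	r.drivers[driver] = endpoint
    }

    func main() {
    	reg := &csiRegistry{drivers: map[string]string{}}

    	// Before registration: every MountDevice/TearDown attempt fails.
    	if _, err := reg.newClient("kubevirt.io.hostpath-provisioner"); err != nil {
    		fmt.Println(err)
    	}

    	// 12:11:06.345: "Register new plugin with name: kubevirt.io.hostpath-provisioner
    	// at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock"
    	reg.register("kubevirt.io.hostpath-provisioner", "/var/lib/kubelet/plugins/csi-hostpath/csi.sock")

    	// After registration the client lookup succeeds and mounts proceed.
    	if ep, err := reg.newClient("kubevirt.io.hostpath-provisioner"); err == nil {
    		fmt.Println("client created for endpoint", ep)
    	}
    }

Note also the csi_attacher line just above: because this driver does not advertise the STAGE_UNSTAGE_VOLUME capability, the kubelet skips the NodeStageVolume call, so MountDevice completes as a no-op and the real work happens in the SetUp (NodePublishVolume) step that follows.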
Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.454874 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.497645 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncchp\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.511735 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdlc5"] Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.523394 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.523628 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.527487 4892 patch_prober.go:28] interesting pod/console-f9d7485db-g2wtm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.527548 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g2wtm" podUID="d26efdd9-e946-418f-95a6-0100f0364b92" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 06 12:11:06 crc kubenswrapper[4892]: W1006 12:11:06.544437 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93d5a0ee_784b_45f0_bdad_5df1f824f031.slice/crio-996a9b8e103079f59ae0cff60076b212c577d873fcdd12c4f9cec431084312ed WatchSource:0}: Error finding container 996a9b8e103079f59ae0cff60076b212c577d873fcdd12c4f9cec431084312ed: Status 404 returned error can't find the container with id 996a9b8e103079f59ae0cff60076b212c577d873fcdd12c4f9cec431084312ed Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.548795 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8s2rt"] Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.549935 4892 util.go:30] "No sandbox for pod can be found. 
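Need to start a new one" pod="openshift-marketplace/redhat-operators-8s2rt"

With the driver registered, the two-phase CSI mount completes: MountDevice records the volume's staging directory (the ".../globalmount" device mount path logged above), and MountVolume.SetUp then publishes the volume into the pod's own directory under /var/lib/kubelet/pods/. A small Go sketch of how those two paths fit together; the idea that the hashed path segment is a SHA-256 of the CSI volume handle is an assumption read off the shape of the logged path, not a verified kubelet contract:

    package main

    import (
    	"crypto/sha256"
    	"encoding/hex"
    	"fmt"
    	"path/filepath"
    )

    const kubeletDir = "/var/lib/kubelet"

    // deviceMountPath mirrors the globalmount layout in the log line above.
    // ASSUMPTION: the hashed segment is derived from the CSI volume handle.
    func deviceMountPath(driver, volumeHandle string) string {
    	sum := sha256.Sum256([]byte(volumeHandle))
    	return filepath.Join(kubeletDir, "plugins/kubernetes.io/csi",
    		driver, hex.EncodeToString(sum[:]), "globalmount")
    }

    // podPublishPath is where NodePublishVolume makes the volume visible
    // to the container runtime for one specific pod.
    func podPublishPath(podUID, volumeName string) string {
    	return filepath.Join(kubeletDir, "pods", podUID,
    		"volumes/kubernetes.io~csi", volumeName, "mount")
    }

    func main() {
    	fmt.Println(deviceMountPath("kubevirt.io.hostpath-provisioner",
    		"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"))
    	fmt.Println(podPublishPath("1d15ec4b-09ec-427a-b002-a7293f363d8a",
    		"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"))
    }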
Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.552189 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.568142 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8s2rt"] Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.611178 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.653898 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz56n\" (UniqueName: \"kubernetes.io/projected/27e60858-52de-4a1a-aa13-c3cd5b23747d-kube-api-access-kz56n\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.654018 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-catalog-content\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.654102 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-utilities\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.687606 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerID="c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458" exitCode=0 Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.687699 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf27c" event={"ID":"8e102df5-64d4-4682-9cb1-22a7165f4294","Type":"ContainerDied","Data":"c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458"} Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.687747 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf27c" event={"ID":"8e102df5-64d4-4682-9cb1-22a7165f4294","Type":"ContainerStarted","Data":"291b2ce22f180c439017d6fcc8a89d51bf5deae515db7895c8bcffab8382f139"} Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.693348 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdlc5" event={"ID":"93d5a0ee-784b-45f0-bdad-5df1f824f031","Type":"ContainerStarted","Data":"996a9b8e103079f59ae0cff60076b212c577d873fcdd12c4f9cec431084312ed"} Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.699923 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jdnln" event={"ID":"ac237b76-87ec-461c-a07d-b9b979f96d75","Type":"ContainerStarted","Data":"d30ceadccf4bca3d23381872a65b0d19725de7ff965a5eea125353c3b13f15e4"} Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.699968 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
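pod="hostpath-provisioner/csi-hostpathplugin-jdnln" event={"ID":"ac237b76-87ec-461c-a07d-b9b979f96d75","Type":"ContainerStarted","Data":"3f09caab13447bc1cc967034136e8cf8aa7564ce015ce531a75b845b2b91450f"}

The "Generic (PLEG): container finished" / "SyncLoop (PLEG): event for pod" pairs above come from the Pod Lifecycle Event Generator: it periodically relists container state from the runtime (CRI-O here) and turns each observed transition into a ContainerStarted or ContainerDied event for the sync loop, which is why every catalog pod's extract container produces a finished-then-event pair. A much-simplified sketch of that relist-and-diff step; podLifecycleEvent mirrors the logged event={...} payload, and the relist function is illustrative:

    package main

    import "fmt"

    // podLifecycleEvent mirrors the event={"ID":...,"Type":...,"Data":...}
    // payloads the sync loop prints above.
    type podLifecycleEvent struct {
    	ID   string // pod UID
    	Type string // "ContainerStarted", "ContainerDied", ...
    	Data string // container (or sandbox) ID the event refers to
    }

    // relist diffs the previous and current container states and emits one
    // event per observed transition, which is roughly what PLEG does on
    // each pass (much simplified).
    func relist(prev, curr map[string]bool, podUID string) []podLifecycleEvent {
    	var events []podLifecycleEvent
    	for id, running := range curr {
    		if running && !prev[id] {
    			events = append(events, podLifecycleEvent{podUID, "ContainerStarted", id})
    		}
    	}
    	for id, running := range prev {
    		if running && !curr[id] {
    			events = append(events, podLifecycleEvent{podUID, "ContainerDied", id})
    		}
    	}
    	return events
    }

    func main() {
    	// States taken from the redhat-marketplace-cf27c entries above.
    	prev := map[string]bool{
    		"c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458": true, // extract container running
    	}
    	curr := map[string]bool{
    		"c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458": false, // it finished (exitCode=0)
    		"291b2ce22f180c439017d6fcc8a89d51bf5deae515db7895c8bcffab8382f139": true,  // pod sandbox is up
    	}
    	for _, e := range relist(prev, curr, "8e102df5-64d4-4682-9cb1-22a7165f4294") {
    		fmt.Printf("SyncLoop (PLEG): event for pod %s: %s %s\n", e.ID, e.Type, e.Data)
    	}
    }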
Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.755042 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-utilities\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.755468 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz56n\" (UniqueName: \"kubernetes.io/projected/27e60858-52de-4a1a-aa13-c3cd5b23747d-kube-api-access-kz56n\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.755594 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-catalog-content\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.755691 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-utilities\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.757294 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-catalog-content\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.779103 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz56n\" (UniqueName: \"kubernetes.io/projected/27e60858-52de-4a1a-aa13-c3cd5b23747d-kube-api-access-kz56n\") pod \"redhat-operators-8s2rt\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.785835 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.862599 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6xp7" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.873006 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.941374 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jkw5" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.951392 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-phfx6"] Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.952520 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.962113 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phfx6"] Oct 06 12:11:06 crc kubenswrapper[4892]: I1006 12:11:06.977312 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.058456 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kube-api-access\") pod \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.058534 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kubelet-dir\") pod \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\" (UID: \"5ac2118d-96d5-49c6-b76b-7fde57eaa826\") " Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.058790 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljp5t\" (UniqueName: \"kubernetes.io/projected/64484fe6-cd8c-492d-9fd5-19dc11f559b8-kube-api-access-ljp5t\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.058782 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5ac2118d-96d5-49c6-b76b-7fde57eaa826" (UID: "5ac2118d-96d5-49c6-b76b-7fde57eaa826"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.064362 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-catalog-content\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.064677 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-utilities\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.064985 4892 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.065674 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5ac2118d-96d5-49c6-b76b-7fde57eaa826" (UID: "5ac2118d-96d5-49c6-b76b-7fde57eaa826"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.166128 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-utilities\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.166193 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljp5t\" (UniqueName: \"kubernetes.io/projected/64484fe6-cd8c-492d-9fd5-19dc11f559b8-kube-api-access-ljp5t\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.166258 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-catalog-content\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.166293 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5ac2118d-96d5-49c6-b76b-7fde57eaa826-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.166686 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-catalog-content\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.166883 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-utilities\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.183478 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljp5t\" (UniqueName: \"kubernetes.io/projected/64484fe6-cd8c-492d-9fd5-19dc11f559b8-kube-api-access-ljp5t\") pod \"redhat-operators-phfx6\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.269485 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.281000 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:07 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:07 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:07 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.281072 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.345044 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8s2rt"] Oct 06 12:11:07 crc kubenswrapper[4892]: W1006 12:11:07.364012 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27e60858_52de_4a1a_aa13_c3cd5b23747d.slice/crio-397937973ce46928b3a3997fbe66e84d36c1755d7df1b3cd24b1d4eda06603e8 WatchSource:0}: Error finding container 397937973ce46928b3a3997fbe66e84d36c1755d7df1b3cd24b1d4eda06603e8: Status 404 returned error can't find the container with id 397937973ce46928b3a3997fbe66e84d36c1755d7df1b3cd24b1d4eda06603e8 Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.407159 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncchp"] Oct 06 12:11:07 crc kubenswrapper[4892]: W1006 12:11:07.413749 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d15ec4b_09ec_427a_b002_a7293f363d8a.slice/crio-dac654b1a6e948fc45c1ad69427f3e42f6678cf0b0e75dace660fcf6c6a80145 WatchSource:0}: Error finding container dac654b1a6e948fc45c1ad69427f3e42f6678cf0b0e75dace660fcf6c6a80145: Status 404 returned error can't find the container with id dac654b1a6e948fc45c1ad69427f3e42f6678cf0b0e75dace660fcf6c6a80145 Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.683608 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 12:11:07 crc kubenswrapper[4892]: E1006 12:11:07.683862 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac2118d-96d5-49c6-b76b-7fde57eaa826" containerName="pruner" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.683873 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ac2118d-96d5-49c6-b76b-7fde57eaa826" containerName="pruner" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.683984 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac2118d-96d5-49c6-b76b-7fde57eaa826" containerName="pruner" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.684386 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.692889 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.693027 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.693275 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.726972 4892 generic.go:334] "Generic (PLEG): container finished" podID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerID="b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c" exitCode=0 Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.727060 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdlc5" event={"ID":"93d5a0ee-784b-45f0-bdad-5df1f824f031","Type":"ContainerDied","Data":"b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c"} Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.738926 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-phfx6"] Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.740406 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jdnln" event={"ID":"ac237b76-87ec-461c-a07d-b9b979f96d75","Type":"ContainerStarted","Data":"2ad94a08d94555ffc1b02aebc348e856ef1910608b8c9cd28bef1622dc1a2123"} Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.749233 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5ac2118d-96d5-49c6-b76b-7fde57eaa826","Type":"ContainerDied","Data":"32b980d5800e730e42de8a8d9d4acb26cddb477db7a3d2f335d22bde4ba20e8d"} Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.749283 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32b980d5800e730e42de8a8d9d4acb26cddb477db7a3d2f335d22bde4ba20e8d" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.749290 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.751436 4892 generic.go:334] "Generic (PLEG): container finished" podID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerID="472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460" exitCode=0 Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.751511 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s2rt" event={"ID":"27e60858-52de-4a1a-aa13-c3cd5b23747d","Type":"ContainerDied","Data":"472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460"} Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.751538 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s2rt" event={"ID":"27e60858-52de-4a1a-aa13-c3cd5b23747d","Type":"ContainerStarted","Data":"397937973ce46928b3a3997fbe66e84d36c1755d7df1b3cd24b1d4eda06603e8"} Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.755433 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" event={"ID":"1d15ec4b-09ec-427a-b002-a7293f363d8a","Type":"ContainerStarted","Data":"3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542"} Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.755468 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" event={"ID":"1d15ec4b-09ec-427a-b002-a7293f363d8a","Type":"ContainerStarted","Data":"dac654b1a6e948fc45c1ad69427f3e42f6678cf0b0e75dace660fcf6c6a80145"} Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.755624 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.769645 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jdnln" podStartSLOduration=14.769626493 podStartE2EDuration="14.769626493s" podCreationTimestamp="2025-10-06 12:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:07.767999706 +0000 UTC m=+154.317705471" watchObservedRunningTime="2025-10-06 12:11:07.769626493 +0000 UTC m=+154.319332258" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.775871 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/824b68fe-6b22-44a5-98d1-1db5623d17c9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.775972 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/824b68fe-6b22-44a5-98d1-1db5623d17c9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.810473 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" podStartSLOduration=132.81045727 podStartE2EDuration="2m12.81045727s" podCreationTimestamp="2025-10-06 12:08:55 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:07.806868702 +0000 UTC m=+154.356574477" watchObservedRunningTime="2025-10-06 12:11:07.81045727 +0000 UTC m=+154.360163035" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.876958 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/824b68fe-6b22-44a5-98d1-1db5623d17c9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.877076 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/824b68fe-6b22-44a5-98d1-1db5623d17c9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.878656 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/824b68fe-6b22-44a5-98d1-1db5623d17c9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:07 crc kubenswrapper[4892]: I1006 12:11:07.902830 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/824b68fe-6b22-44a5-98d1-1db5623d17c9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.010570 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.183626 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.293015 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:08 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:08 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:08 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.293095 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.295812 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.803992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"824b68fe-6b22-44a5-98d1-1db5623d17c9","Type":"ContainerStarted","Data":"6c7e33161b8be9473c42de479729de5169a735d00c93e1029271e2f64c7d087e"} Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.804353 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"824b68fe-6b22-44a5-98d1-1db5623d17c9","Type":"ContainerStarted","Data":"6a3452c35740cce4c10bc8c3cb8d23b36caa7cb48f443fc25f312396cb3fe0de"} Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.808260 4892 generic.go:334] "Generic (PLEG): container finished" podID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerID="bcb5391038845af2610fa1a24986e76e359573c8e8eeb107144bb3718e84ab4d" exitCode=0 Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.809228 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfx6" event={"ID":"64484fe6-cd8c-492d-9fd5-19dc11f559b8","Type":"ContainerDied","Data":"bcb5391038845af2610fa1a24986e76e359573c8e8eeb107144bb3718e84ab4d"} Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.809250 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfx6" event={"ID":"64484fe6-cd8c-492d-9fd5-19dc11f559b8","Type":"ContainerStarted","Data":"fdb8d604fb779a3560cf10b28b84885be2469e7ee6f3017bb6ef6a860c679537"} Oct 06 12:11:08 crc kubenswrapper[4892]: I1006 12:11:08.952179 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hfjg7" Oct 06 12:11:09 crc kubenswrapper[4892]: I1006 12:11:09.280213 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:09 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:09 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:09 crc kubenswrapper[4892]: healthz 
check failed Oct 06 12:11:09 crc kubenswrapper[4892]: I1006 12:11:09.280276 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:09 crc kubenswrapper[4892]: I1006 12:11:09.818663 4892 generic.go:334] "Generic (PLEG): container finished" podID="824b68fe-6b22-44a5-98d1-1db5623d17c9" containerID="6c7e33161b8be9473c42de479729de5169a735d00c93e1029271e2f64c7d087e" exitCode=0 Oct 06 12:11:09 crc kubenswrapper[4892]: I1006 12:11:09.818704 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"824b68fe-6b22-44a5-98d1-1db5623d17c9","Type":"ContainerDied","Data":"6c7e33161b8be9473c42de479729de5169a735d00c93e1029271e2f64c7d087e"} Oct 06 12:11:10 crc kubenswrapper[4892]: I1006 12:11:10.280842 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:10 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:10 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:10 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:10 crc kubenswrapper[4892]: I1006 12:11:10.281131 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:10 crc kubenswrapper[4892]: I1006 12:11:10.668644 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:11:10 crc kubenswrapper[4892]: I1006 12:11:10.673231 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cxk5j" Oct 06 12:11:11 crc kubenswrapper[4892]: I1006 12:11:11.280781 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:11 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:11 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:11 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:11 crc kubenswrapper[4892]: I1006 12:11:11.281000 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:12 crc kubenswrapper[4892]: I1006 12:11:12.280105 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:12 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:12 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:12 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:12 crc kubenswrapper[4892]: I1006 12:11:12.280167 4892 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:13 crc kubenswrapper[4892]: I1006 12:11:13.280239 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:13 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:13 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:13 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:13 crc kubenswrapper[4892]: I1006 12:11:13.280305 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.279931 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:14 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:14 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:14 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.280010 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.716766 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.800434 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/824b68fe-6b22-44a5-98d1-1db5623d17c9-kube-api-access\") pod \"824b68fe-6b22-44a5-98d1-1db5623d17c9\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.801416 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/824b68fe-6b22-44a5-98d1-1db5623d17c9-kubelet-dir\") pod \"824b68fe-6b22-44a5-98d1-1db5623d17c9\" (UID: \"824b68fe-6b22-44a5-98d1-1db5623d17c9\") " Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.801459 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/824b68fe-6b22-44a5-98d1-1db5623d17c9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "824b68fe-6b22-44a5-98d1-1db5623d17c9" (UID: "824b68fe-6b22-44a5-98d1-1db5623d17c9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.801687 4892 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/824b68fe-6b22-44a5-98d1-1db5623d17c9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.805591 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824b68fe-6b22-44a5-98d1-1db5623d17c9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "824b68fe-6b22-44a5-98d1-1db5623d17c9" (UID: "824b68fe-6b22-44a5-98d1-1db5623d17c9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.871306 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"824b68fe-6b22-44a5-98d1-1db5623d17c9","Type":"ContainerDied","Data":"6a3452c35740cce4c10bc8c3cb8d23b36caa7cb48f443fc25f312396cb3fe0de"} Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.871383 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3452c35740cce4c10bc8c3cb8d23b36caa7cb48f443fc25f312396cb3fe0de" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.871437 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 06 12:11:14 crc kubenswrapper[4892]: I1006 12:11:14.902542 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/824b68fe-6b22-44a5-98d1-1db5623d17c9-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 06 12:11:15 crc kubenswrapper[4892]: I1006 12:11:15.280006 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:15 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:15 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:15 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:15 crc kubenswrapper[4892]: I1006 12:11:15.280070 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:15 crc kubenswrapper[4892]: I1006 12:11:15.779763 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-qhnln container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 06 12:11:15 crc kubenswrapper[4892]: I1006 12:11:15.779819 4892 patch_prober.go:28] interesting pod/downloads-7954f5f757-qhnln container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 06 12:11:15 crc kubenswrapper[4892]: I1006 12:11:15.779835 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qhnln" podUID="b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 06 12:11:15 crc kubenswrapper[4892]: I1006 12:11:15.779892 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qhnln" podUID="b2b7f5f5-c72f-4ffc-b2ab-b89aa8a9bbd2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.38:8080/\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 06 12:11:16 crc kubenswrapper[4892]: I1006 12:11:16.282839 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:16 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:16 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:16 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:16 crc kubenswrapper[4892]: I1006 12:11:16.282927 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:16 crc kubenswrapper[4892]: I1006 12:11:16.523458 4892 patch_prober.go:28] interesting pod/console-f9d7485db-g2wtm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Oct 06 12:11:16 crc kubenswrapper[4892]: I1006 12:11:16.523549 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g2wtm" podUID="d26efdd9-e946-418f-95a6-0100f0364b92" containerName="console" probeResult="failure" output="Get \"https://10.217.0.17:8443/health\": dial tcp 10.217.0.17:8443: connect: connection refused" Oct 06 12:11:17 crc kubenswrapper[4892]: I1006 12:11:17.028732 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:11:17 crc kubenswrapper[4892]: I1006 12:11:17.034813 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d042dea2-ba2d-4825-a01c-79d5eb2fc912-metrics-certs\") pod \"network-metrics-daemon-bf88v\" (UID: \"d042dea2-ba2d-4825-a01c-79d5eb2fc912\") " pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:11:17 crc kubenswrapper[4892]: I1006 12:11:17.281185 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:17 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:17 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:17 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:17 crc kubenswrapper[4892]: I1006 12:11:17.281266 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed 
with statuscode: 500" Oct 06 12:11:17 crc kubenswrapper[4892]: I1006 12:11:17.295622 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bf88v" Oct 06 12:11:18 crc kubenswrapper[4892]: I1006 12:11:18.280285 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:18 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:18 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:18 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:18 crc kubenswrapper[4892]: I1006 12:11:18.280515 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:19 crc kubenswrapper[4892]: I1006 12:11:19.281531 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:19 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:19 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:19 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:19 crc kubenswrapper[4892]: I1006 12:11:19.281604 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:20 crc kubenswrapper[4892]: I1006 12:11:20.281503 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:20 crc kubenswrapper[4892]: [-]has-synced failed: reason withheld Oct 06 12:11:20 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:20 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:20 crc kubenswrapper[4892]: I1006 12:11:20.282032 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:21 crc kubenswrapper[4892]: I1006 12:11:21.280600 4892 patch_prober.go:28] interesting pod/router-default-5444994796-psw22 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 06 12:11:21 crc kubenswrapper[4892]: [+]has-synced ok Oct 06 12:11:21 crc kubenswrapper[4892]: [+]process-running ok Oct 06 12:11:21 crc kubenswrapper[4892]: healthz check failed Oct 06 12:11:21 crc kubenswrapper[4892]: I1006 12:11:21.280729 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-psw22" podUID="4c9c567b-051e-4d81-9f50-f575b43b3a04" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 06 12:11:22 crc kubenswrapper[4892]: 
I1006 12:11:22.281287 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:11:22 crc kubenswrapper[4892]: I1006 12:11:22.284114 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-psw22" Oct 06 12:11:22 crc kubenswrapper[4892]: I1006 12:11:22.986773 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:11:22 crc kubenswrapper[4892]: I1006 12:11:22.986887 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:11:25 crc kubenswrapper[4892]: I1006 12:11:25.785985 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qhnln" Oct 06 12:11:26 crc kubenswrapper[4892]: I1006 12:11:26.547236 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:11:26 crc kubenswrapper[4892]: I1006 12:11:26.551266 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:11:26 crc kubenswrapper[4892]: I1006 12:11:26.793850 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:11:36 crc kubenswrapper[4892]: I1006 12:11:36.614437 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xp47t" Oct 06 12:11:36 crc kubenswrapper[4892]: E1006 12:11:36.879074 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 06 12:11:36 crc kubenswrapper[4892]: E1006 12:11:36.879303 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m2jfd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mtgsp_openshift-marketplace(bfaba287-8034-4e1b-a4cd-f6c7962f9d45): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 12:11:36 crc kubenswrapper[4892]: E1006 12:11:36.880534 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mtgsp" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" Oct 06 12:11:37 crc kubenswrapper[4892]: E1006 12:11:37.927509 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mtgsp" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" Oct 06 12:11:38 crc kubenswrapper[4892]: E1006 12:11:38.001739 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 12:11:38 crc kubenswrapper[4892]: E1006 12:11:38.001911 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-smnz9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-swbp7_openshift-marketplace(9e19c6b4-23cb-4864-9470-ef8acaa1f5fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 12:11:38 crc kubenswrapper[4892]: E1006 12:11:38.003064 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-swbp7" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" Oct 06 12:11:38 crc kubenswrapper[4892]: E1006 12:11:38.009778 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 06 12:11:38 crc kubenswrapper[4892]: E1006 12:11:38.009946 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jppnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-44kmf_openshift-marketplace(84310366-fec4-4521-a296-7fdba4b65821): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 12:11:38 crc kubenswrapper[4892]: E1006 12:11:38.011147 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-44kmf" podUID="84310366-fec4-4521-a296-7fdba4b65821" Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.143931 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-44kmf" podUID="84310366-fec4-4521-a296-7fdba4b65821" Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.143950 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-swbp7" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" Oct 06 12:11:40 crc kubenswrapper[4892]: I1006 12:11:40.524161 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bf88v"] Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.916571 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.916964 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nwsl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cf27c_openshift-marketplace(8e102df5-64d4-4682-9cb1-22a7165f4294): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.917552 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.917713 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k2r6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cdlc5_openshift-marketplace(93d5a0ee-784b-45f0-bdad-5df1f824f031): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.918274 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cf27c" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" Oct 06 12:11:40 crc kubenswrapper[4892]: E1006 12:11:40.919903 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cdlc5" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" Oct 06 12:11:41 crc kubenswrapper[4892]: I1006 12:11:41.022898 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfx6" event={"ID":"64484fe6-cd8c-492d-9fd5-19dc11f559b8","Type":"ContainerStarted","Data":"32b0b2782c392803265199cf991930fcf0b29093b1fcc3fb4aa31b800471923a"} Oct 06 12:11:41 crc kubenswrapper[4892]: I1006 12:11:41.028171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s2rt" event={"ID":"27e60858-52de-4a1a-aa13-c3cd5b23747d","Type":"ContainerStarted","Data":"cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323"} Oct 06 12:11:41 crc kubenswrapper[4892]: I1006 12:11:41.031742 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bf88v" event={"ID":"d042dea2-ba2d-4825-a01c-79d5eb2fc912","Type":"ContainerStarted","Data":"b45ddf91df8338146744c4aa84c8ef6c39c7467c14ccf02f7b03a340965615cc"} Oct 06 12:11:41 crc kubenswrapper[4892]: I1006 12:11:41.035435 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zd5m" 
event={"ID":"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac","Type":"ContainerStarted","Data":"c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c"} Oct 06 12:11:41 crc kubenswrapper[4892]: E1006 12:11:41.036177 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cdlc5" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" Oct 06 12:11:41 crc kubenswrapper[4892]: E1006 12:11:41.036624 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cf27c" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.044645 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bf88v" event={"ID":"d042dea2-ba2d-4825-a01c-79d5eb2fc912","Type":"ContainerStarted","Data":"442a026b35819738ea64beb46f8ccf794e28f9b09c1089df463a41f56cf54aa5"} Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.045060 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bf88v" event={"ID":"d042dea2-ba2d-4825-a01c-79d5eb2fc912","Type":"ContainerStarted","Data":"64f587f550b2b413314a9f475d78bfc1a1482583d8d14117f9d4ed4e02a4b76a"} Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.049059 4892 generic.go:334] "Generic (PLEG): container finished" podID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerID="c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c" exitCode=0 Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.049127 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zd5m" event={"ID":"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac","Type":"ContainerDied","Data":"c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c"} Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.063163 4892 generic.go:334] "Generic (PLEG): container finished" podID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerID="32b0b2782c392803265199cf991930fcf0b29093b1fcc3fb4aa31b800471923a" exitCode=0 Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.063226 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfx6" event={"ID":"64484fe6-cd8c-492d-9fd5-19dc11f559b8","Type":"ContainerDied","Data":"32b0b2782c392803265199cf991930fcf0b29093b1fcc3fb4aa31b800471923a"} Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.071971 4892 generic.go:334] "Generic (PLEG): container finished" podID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerID="cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323" exitCode=0 Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.072027 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s2rt" event={"ID":"27e60858-52de-4a1a-aa13-c3cd5b23747d","Type":"ContainerDied","Data":"cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323"} Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.077529 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bf88v" podStartSLOduration=167.077498206 
podStartE2EDuration="2m47.077498206s" podCreationTimestamp="2025-10-06 12:08:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:11:42.070230357 +0000 UTC m=+188.619936142" watchObservedRunningTime="2025-10-06 12:11:42.077498206 +0000 UTC m=+188.627204011" Oct 06 12:11:42 crc kubenswrapper[4892]: I1006 12:11:42.397821 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 06 12:11:43 crc kubenswrapper[4892]: I1006 12:11:43.082182 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfx6" event={"ID":"64484fe6-cd8c-492d-9fd5-19dc11f559b8","Type":"ContainerStarted","Data":"74a40dc244c9a4496dfb8c969a31fe16ac70e32affe0164a33a167b0655e4756"} Oct 06 12:11:43 crc kubenswrapper[4892]: I1006 12:11:43.086565 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s2rt" event={"ID":"27e60858-52de-4a1a-aa13-c3cd5b23747d","Type":"ContainerStarted","Data":"afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b"} Oct 06 12:11:43 crc kubenswrapper[4892]: I1006 12:11:43.110726 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-phfx6" podStartSLOduration=3.371260522 podStartE2EDuration="37.110709066s" podCreationTimestamp="2025-10-06 12:11:06 +0000 UTC" firstStartedPulling="2025-10-06 12:11:08.809777698 +0000 UTC m=+155.359483463" lastFinishedPulling="2025-10-06 12:11:42.549226242 +0000 UTC m=+189.098932007" observedRunningTime="2025-10-06 12:11:43.107390369 +0000 UTC m=+189.657096144" watchObservedRunningTime="2025-10-06 12:11:43.110709066 +0000 UTC m=+189.660414841" Oct 06 12:11:43 crc kubenswrapper[4892]: I1006 12:11:43.138496 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8s2rt" podStartSLOduration=2.125505321 podStartE2EDuration="37.138467056s" podCreationTimestamp="2025-10-06 12:11:06 +0000 UTC" firstStartedPulling="2025-10-06 12:11:07.753922208 +0000 UTC m=+154.303627973" lastFinishedPulling="2025-10-06 12:11:42.766883933 +0000 UTC m=+189.316589708" observedRunningTime="2025-10-06 12:11:43.134706331 +0000 UTC m=+189.684412136" watchObservedRunningTime="2025-10-06 12:11:43.138467056 +0000 UTC m=+189.688172851" Oct 06 12:11:44 crc kubenswrapper[4892]: I1006 12:11:44.093862 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zd5m" event={"ID":"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac","Type":"ContainerStarted","Data":"d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce"} Oct 06 12:11:44 crc kubenswrapper[4892]: I1006 12:11:44.121660 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2zd5m" podStartSLOduration=2.557028329 podStartE2EDuration="41.12164361s" podCreationTimestamp="2025-10-06 12:11:03 +0000 UTC" firstStartedPulling="2025-10-06 12:11:04.643793488 +0000 UTC m=+151.193499253" lastFinishedPulling="2025-10-06 12:11:43.208408729 +0000 UTC m=+189.758114534" observedRunningTime="2025-10-06 12:11:44.119027883 +0000 UTC m=+190.668733658" watchObservedRunningTime="2025-10-06 12:11:44.12164361 +0000 UTC m=+190.671349375" Oct 06 12:11:46 crc kubenswrapper[4892]: I1006 12:11:46.873926 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:46 crc kubenswrapper[4892]: I1006 12:11:46.874381 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:47 crc kubenswrapper[4892]: I1006 12:11:47.270225 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:47 crc kubenswrapper[4892]: I1006 12:11:47.271043 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:48 crc kubenswrapper[4892]: I1006 12:11:48.046769 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8s2rt" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="registry-server" probeResult="failure" output=< Oct 06 12:11:48 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Oct 06 12:11:48 crc kubenswrapper[4892]: > Oct 06 12:11:48 crc kubenswrapper[4892]: I1006 12:11:48.309059 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-phfx6" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="registry-server" probeResult="failure" output=< Oct 06 12:11:48 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Oct 06 12:11:48 crc kubenswrapper[4892]: > Oct 06 12:11:52 crc kubenswrapper[4892]: I1006 12:11:52.141767 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtgsp" event={"ID":"bfaba287-8034-4e1b-a4cd-f6c7962f9d45","Type":"ContainerStarted","Data":"785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d"} Oct 06 12:11:52 crc kubenswrapper[4892]: I1006 12:11:52.985054 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:11:52 crc kubenswrapper[4892]: I1006 12:11:52.985131 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:11:53 crc kubenswrapper[4892]: I1006 12:11:53.152747 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtgsp" event={"ID":"bfaba287-8034-4e1b-a4cd-f6c7962f9d45","Type":"ContainerDied","Data":"785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d"} Oct 06 12:11:53 crc kubenswrapper[4892]: I1006 12:11:53.152835 4892 generic.go:334] "Generic (PLEG): container finished" podID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerID="785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d" exitCode=0 Oct 06 12:11:53 crc kubenswrapper[4892]: I1006 12:11:53.672512 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:53 crc kubenswrapper[4892]: I1006 12:11:53.672904 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:53 crc kubenswrapper[4892]: I1006 12:11:53.775941 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:54 crc kubenswrapper[4892]: I1006 12:11:54.212660 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:11:56 crc kubenswrapper[4892]: I1006 12:11:56.176469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdlc5" event={"ID":"93d5a0ee-784b-45f0-bdad-5df1f824f031","Type":"ContainerStarted","Data":"00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4"} Oct 06 12:11:56 crc kubenswrapper[4892]: I1006 12:11:56.179089 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtgsp" event={"ID":"bfaba287-8034-4e1b-a4cd-f6c7962f9d45","Type":"ContainerStarted","Data":"c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277"} Oct 06 12:11:56 crc kubenswrapper[4892]: I1006 12:11:56.182248 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerID="e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4" exitCode=0 Oct 06 12:11:56 crc kubenswrapper[4892]: I1006 12:11:56.182334 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf27c" event={"ID":"8e102df5-64d4-4682-9cb1-22a7165f4294","Type":"ContainerDied","Data":"e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4"} Oct 06 12:11:56 crc kubenswrapper[4892]: I1006 12:11:56.228944 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mtgsp" podStartSLOduration=3.55823393 podStartE2EDuration="53.22892916s" podCreationTimestamp="2025-10-06 12:11:03 +0000 UTC" firstStartedPulling="2025-10-06 12:11:05.658817091 +0000 UTC m=+152.208522856" lastFinishedPulling="2025-10-06 12:11:55.329512291 +0000 UTC m=+201.879218086" observedRunningTime="2025-10-06 12:11:56.227319523 +0000 UTC m=+202.777025288" watchObservedRunningTime="2025-10-06 12:11:56.22892916 +0000 UTC m=+202.778634915" Oct 06 12:11:56 crc kubenswrapper[4892]: I1006 12:11:56.928807 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:56 crc kubenswrapper[4892]: I1006 12:11:56.983476 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.189937 4892 generic.go:334] "Generic (PLEG): container finished" podID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerID="00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4" exitCode=0 Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.189992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdlc5" event={"ID":"93d5a0ee-784b-45f0-bdad-5df1f824f031","Type":"ContainerDied","Data":"00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4"} Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.195045 4892 generic.go:334] "Generic (PLEG): container finished" podID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerID="37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9" exitCode=0 Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.195120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swbp7" 
event={"ID":"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc","Type":"ContainerDied","Data":"37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9"} Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.198223 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf27c" event={"ID":"8e102df5-64d4-4682-9cb1-22a7165f4294","Type":"ContainerStarted","Data":"fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1"} Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.201080 4892 generic.go:334] "Generic (PLEG): container finished" podID="84310366-fec4-4521-a296-7fdba4b65821" containerID="e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b" exitCode=0 Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.201161 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44kmf" event={"ID":"84310366-fec4-4521-a296-7fdba4b65821","Type":"ContainerDied","Data":"e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b"} Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.272737 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cf27c" podStartSLOduration=2.306821006 podStartE2EDuration="52.272722501s" podCreationTimestamp="2025-10-06 12:11:05 +0000 UTC" firstStartedPulling="2025-10-06 12:11:06.689491156 +0000 UTC m=+153.239196921" lastFinishedPulling="2025-10-06 12:11:56.655392651 +0000 UTC m=+203.205098416" observedRunningTime="2025-10-06 12:11:57.267961494 +0000 UTC m=+203.817667259" watchObservedRunningTime="2025-10-06 12:11:57.272722501 +0000 UTC m=+203.822428266" Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.323038 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:57 crc kubenswrapper[4892]: I1006 12:11:57.368240 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:11:58 crc kubenswrapper[4892]: I1006 12:11:58.208191 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swbp7" event={"ID":"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc","Type":"ContainerStarted","Data":"e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c"} Oct 06 12:11:58 crc kubenswrapper[4892]: I1006 12:11:58.211345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44kmf" event={"ID":"84310366-fec4-4521-a296-7fdba4b65821","Type":"ContainerStarted","Data":"3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec"} Oct 06 12:11:58 crc kubenswrapper[4892]: I1006 12:11:58.213470 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdlc5" event={"ID":"93d5a0ee-784b-45f0-bdad-5df1f824f031","Type":"ContainerStarted","Data":"828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6"} Oct 06 12:11:58 crc kubenswrapper[4892]: I1006 12:11:58.233302 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-swbp7" podStartSLOduration=3.111679256 podStartE2EDuration="55.2332874s" podCreationTimestamp="2025-10-06 12:11:03 +0000 UTC" firstStartedPulling="2025-10-06 12:11:05.663763384 +0000 UTC m=+152.213469159" lastFinishedPulling="2025-10-06 12:11:57.785371528 +0000 UTC m=+204.335077303" observedRunningTime="2025-10-06 12:11:58.229993635 +0000 UTC 
m=+204.779699410" watchObservedRunningTime="2025-10-06 12:11:58.2332874 +0000 UTC m=+204.782993165" Oct 06 12:11:58 crc kubenswrapper[4892]: I1006 12:11:58.249816 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-44kmf" podStartSLOduration=3.065233556 podStartE2EDuration="55.249797058s" podCreationTimestamp="2025-10-06 12:11:03 +0000 UTC" firstStartedPulling="2025-10-06 12:11:05.662839196 +0000 UTC m=+152.212544961" lastFinishedPulling="2025-10-06 12:11:57.847402698 +0000 UTC m=+204.397108463" observedRunningTime="2025-10-06 12:11:58.246385058 +0000 UTC m=+204.796090823" watchObservedRunningTime="2025-10-06 12:11:58.249797058 +0000 UTC m=+204.799502833" Oct 06 12:11:58 crc kubenswrapper[4892]: I1006 12:11:58.264713 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cdlc5" podStartSLOduration=3.329063849 podStartE2EDuration="53.264693399s" podCreationTimestamp="2025-10-06 12:11:05 +0000 UTC" firstStartedPulling="2025-10-06 12:11:07.729033565 +0000 UTC m=+154.278739320" lastFinishedPulling="2025-10-06 12:11:57.664663105 +0000 UTC m=+204.214368870" observedRunningTime="2025-10-06 12:11:58.26243122 +0000 UTC m=+204.812136975" watchObservedRunningTime="2025-10-06 12:11:58.264693399 +0000 UTC m=+204.814399174" Oct 06 12:12:00 crc kubenswrapper[4892]: I1006 12:12:00.609763 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phfx6"] Oct 06 12:12:00 crc kubenswrapper[4892]: I1006 12:12:00.610683 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-phfx6" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="registry-server" containerID="cri-o://74a40dc244c9a4496dfb8c969a31fe16ac70e32affe0164a33a167b0655e4756" gracePeriod=2 Oct 06 12:12:01 crc kubenswrapper[4892]: I1006 12:12:01.237499 4892 generic.go:334] "Generic (PLEG): container finished" podID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerID="74a40dc244c9a4496dfb8c969a31fe16ac70e32affe0164a33a167b0655e4756" exitCode=0 Oct 06 12:12:01 crc kubenswrapper[4892]: I1006 12:12:01.237584 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfx6" event={"ID":"64484fe6-cd8c-492d-9fd5-19dc11f559b8","Type":"ContainerDied","Data":"74a40dc244c9a4496dfb8c969a31fe16ac70e32affe0164a33a167b0655e4756"} Oct 06 12:12:01 crc kubenswrapper[4892]: I1006 12:12:01.964369 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.062536 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-catalog-content\") pod \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.062640 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-utilities\") pod \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.062704 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljp5t\" (UniqueName: \"kubernetes.io/projected/64484fe6-cd8c-492d-9fd5-19dc11f559b8-kube-api-access-ljp5t\") pod \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\" (UID: \"64484fe6-cd8c-492d-9fd5-19dc11f559b8\") " Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.063560 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-utilities" (OuterVolumeSpecName: "utilities") pod "64484fe6-cd8c-492d-9fd5-19dc11f559b8" (UID: "64484fe6-cd8c-492d-9fd5-19dc11f559b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.068292 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64484fe6-cd8c-492d-9fd5-19dc11f559b8-kube-api-access-ljp5t" (OuterVolumeSpecName: "kube-api-access-ljp5t") pod "64484fe6-cd8c-492d-9fd5-19dc11f559b8" (UID: "64484fe6-cd8c-492d-9fd5-19dc11f559b8"). InnerVolumeSpecName "kube-api-access-ljp5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.147753 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64484fe6-cd8c-492d-9fd5-19dc11f559b8" (UID: "64484fe6-cd8c-492d-9fd5-19dc11f559b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.164143 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.164191 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljp5t\" (UniqueName: \"kubernetes.io/projected/64484fe6-cd8c-492d-9fd5-19dc11f559b8-kube-api-access-ljp5t\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.164211 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64484fe6-cd8c-492d-9fd5-19dc11f559b8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.245881 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-phfx6" event={"ID":"64484fe6-cd8c-492d-9fd5-19dc11f559b8","Type":"ContainerDied","Data":"fdb8d604fb779a3560cf10b28b84885be2469e7ee6f3017bb6ef6a860c679537"} Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.245936 4892 scope.go:117] "RemoveContainer" containerID="74a40dc244c9a4496dfb8c969a31fe16ac70e32affe0164a33a167b0655e4756" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.245991 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-phfx6" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.265047 4892 scope.go:117] "RemoveContainer" containerID="32b0b2782c392803265199cf991930fcf0b29093b1fcc3fb4aa31b800471923a" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.287195 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-phfx6"] Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.292600 4892 scope.go:117] "RemoveContainer" containerID="bcb5391038845af2610fa1a24986e76e359573c8e8eeb107144bb3718e84ab4d" Oct 06 12:12:02 crc kubenswrapper[4892]: I1006 12:12:02.294171 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-phfx6"] Oct 06 12:12:03 crc kubenswrapper[4892]: I1006 12:12:03.898747 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:12:03 crc kubenswrapper[4892]: I1006 12:12:03.899083 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:12:03 crc kubenswrapper[4892]: I1006 12:12:03.946951 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.084605 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.084684 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.134680 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.176164 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" path="/var/lib/kubelet/pods/64484fe6-cd8c-492d-9fd5-19dc11f559b8/volumes" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.296120 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.328245 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.328302 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.352157 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:12:04 crc kubenswrapper[4892]: I1006 12:12:04.379099 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:12:05 crc kubenswrapper[4892]: I1006 12:12:05.296643 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:12:05 crc kubenswrapper[4892]: I1006 12:12:05.880877 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:12:05 crc kubenswrapper[4892]: I1006 12:12:05.880958 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:12:05 crc kubenswrapper[4892]: I1006 12:12:05.959148 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.010932 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtgsp"] Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.265411 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mtgsp" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="registry-server" containerID="cri-o://c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277" gracePeriod=2 Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.272623 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.272673 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.313640 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.325541 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.605106 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.732239 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-utilities\") pod \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.732342 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2jfd\" (UniqueName: \"kubernetes.io/projected/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-kube-api-access-m2jfd\") pod \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.732424 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-catalog-content\") pod \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\" (UID: \"bfaba287-8034-4e1b-a4cd-f6c7962f9d45\") " Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.733152 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-utilities" (OuterVolumeSpecName: "utilities") pod "bfaba287-8034-4e1b-a4cd-f6c7962f9d45" (UID: "bfaba287-8034-4e1b-a4cd-f6c7962f9d45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.740102 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-kube-api-access-m2jfd" (OuterVolumeSpecName: "kube-api-access-m2jfd") pod "bfaba287-8034-4e1b-a4cd-f6c7962f9d45" (UID: "bfaba287-8034-4e1b-a4cd-f6c7962f9d45"). InnerVolumeSpecName "kube-api-access-m2jfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.777055 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfaba287-8034-4e1b-a4cd-f6c7962f9d45" (UID: "bfaba287-8034-4e1b-a4cd-f6c7962f9d45"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.833680 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2jfd\" (UniqueName: \"kubernetes.io/projected/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-kube-api-access-m2jfd\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.833709 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:06 crc kubenswrapper[4892]: I1006 12:12:06.833720 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfaba287-8034-4e1b-a4cd-f6c7962f9d45-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.281502 4892 generic.go:334] "Generic (PLEG): container finished" podID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerID="c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277" exitCode=0 Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.282431 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mtgsp" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.285442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtgsp" event={"ID":"bfaba287-8034-4e1b-a4cd-f6c7962f9d45","Type":"ContainerDied","Data":"c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277"} Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.285473 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mtgsp" event={"ID":"bfaba287-8034-4e1b-a4cd-f6c7962f9d45","Type":"ContainerDied","Data":"0351ed4c868e9fe4e97cffe0a525f2c737da2424fcbcdde3337d214d734a6a63"} Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.285489 4892 scope.go:117] "RemoveContainer" containerID="c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.307636 4892 scope.go:117] "RemoveContainer" containerID="785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.309116 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mtgsp"] Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.312508 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mtgsp"] Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.335522 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.338875 4892 scope.go:117] "RemoveContainer" containerID="cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.355028 4892 scope.go:117] "RemoveContainer" containerID="c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277" Oct 06 12:12:07 crc kubenswrapper[4892]: E1006 12:12:07.355566 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277\": container with ID starting with 
c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277 not found: ID does not exist" containerID="c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.355600 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277"} err="failed to get container status \"c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277\": rpc error: code = NotFound desc = could not find container \"c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277\": container with ID starting with c36d7ce2700e9b7cf900641543cbaf400ce20fe4a0b8ee25eb6cd6e0646e6277 not found: ID does not exist" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.355639 4892 scope.go:117] "RemoveContainer" containerID="785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d" Oct 06 12:12:07 crc kubenswrapper[4892]: E1006 12:12:07.355903 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d\": container with ID starting with 785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d not found: ID does not exist" containerID="785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.355945 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d"} err="failed to get container status \"785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d\": rpc error: code = NotFound desc = could not find container \"785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d\": container with ID starting with 785ed352c8c8f5158a049d64061b1d7990be415c272a8bf268e2cc7f2978f47d not found: ID does not exist" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.355976 4892 scope.go:117] "RemoveContainer" containerID="cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b" Oct 06 12:12:07 crc kubenswrapper[4892]: E1006 12:12:07.356389 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b\": container with ID starting with cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b not found: ID does not exist" containerID="cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.356412 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b"} err="failed to get container status \"cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b\": rpc error: code = NotFound desc = could not find container \"cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b\": container with ID starting with cecad714da8713011c3bca9932293f3c36aea3f03d884f66bc957c33aa9ada8b not found: ID does not exist" Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.407515 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swbp7"] Oct 06 12:12:07 crc kubenswrapper[4892]: I1006 12:12:07.407745 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-swbp7" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="registry-server" containerID="cri-o://e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c" gracePeriod=2 Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.174242 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" path="/var/lib/kubelet/pods/bfaba287-8034-4e1b-a4cd-f6c7962f9d45/volumes" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.266192 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.288829 4892 generic.go:334] "Generic (PLEG): container finished" podID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerID="e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c" exitCode=0 Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.288904 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swbp7" event={"ID":"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc","Type":"ContainerDied","Data":"e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c"} Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.288925 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swbp7" event={"ID":"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc","Type":"ContainerDied","Data":"7a988d017863fbf170c9b2e64a45fcc6e688c1fced20b9132f86c63120fba0f5"} Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.288928 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swbp7" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.288943 4892 scope.go:117] "RemoveContainer" containerID="e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.303385 4892 scope.go:117] "RemoveContainer" containerID="37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.315842 4892 scope.go:117] "RemoveContainer" containerID="8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.334518 4892 scope.go:117] "RemoveContainer" containerID="e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c" Oct 06 12:12:08 crc kubenswrapper[4892]: E1006 12:12:08.334999 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c\": container with ID starting with e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c not found: ID does not exist" containerID="e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.335049 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c"} err="failed to get container status \"e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c\": rpc error: code = NotFound desc = could not find container \"e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c\": container with ID starting with e63b30dbe36e30ec1b4bf13561ea9cda3e7e87591cfce2017aa92bae7bbf4c2c not found: ID does not exist" 
Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.335082 4892 scope.go:117] "RemoveContainer" containerID="37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9" Oct 06 12:12:08 crc kubenswrapper[4892]: E1006 12:12:08.335500 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9\": container with ID starting with 37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9 not found: ID does not exist" containerID="37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.335530 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9"} err="failed to get container status \"37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9\": rpc error: code = NotFound desc = could not find container \"37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9\": container with ID starting with 37e047ee77c20cbb4113600be8a860ee721169cc8b0d2625cbfc30b12211dcb9 not found: ID does not exist" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.335548 4892 scope.go:117] "RemoveContainer" containerID="8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050" Oct 06 12:12:08 crc kubenswrapper[4892]: E1006 12:12:08.335830 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050\": container with ID starting with 8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050 not found: ID does not exist" containerID="8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.335855 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050"} err="failed to get container status \"8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050\": rpc error: code = NotFound desc = could not find container \"8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050\": container with ID starting with 8dafc091962edb964ef1f427f405bf52f75da1205dd2c8570f52f814621a9050 not found: ID does not exist" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.350411 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-utilities\") pod \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.350468 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-catalog-content\") pod \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.350587 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smnz9\" (UniqueName: \"kubernetes.io/projected/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-kube-api-access-smnz9\") pod \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\" (UID: \"9e19c6b4-23cb-4864-9470-ef8acaa1f5fc\") " Oct 06 
12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.352187 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-utilities" (OuterVolumeSpecName: "utilities") pod "9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" (UID: "9e19c6b4-23cb-4864-9470-ef8acaa1f5fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.358190 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-kube-api-access-smnz9" (OuterVolumeSpecName: "kube-api-access-smnz9") pod "9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" (UID: "9e19c6b4-23cb-4864-9470-ef8acaa1f5fc"). InnerVolumeSpecName "kube-api-access-smnz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.395646 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" (UID: "9e19c6b4-23cb-4864-9470-ef8acaa1f5fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.407979 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdlc5"] Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.451566 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smnz9\" (UniqueName: \"kubernetes.io/projected/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-kube-api-access-smnz9\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.451603 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.451616 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.614214 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swbp7"] Oct 06 12:12:08 crc kubenswrapper[4892]: I1006 12:12:08.618372 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-swbp7"] Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.295137 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cdlc5" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="registry-server" containerID="cri-o://828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6" gracePeriod=2 Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.609296 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.663353 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-catalog-content\") pod \"93d5a0ee-784b-45f0-bdad-5df1f824f031\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.663444 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-utilities\") pod \"93d5a0ee-784b-45f0-bdad-5df1f824f031\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.663480 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2r6g\" (UniqueName: \"kubernetes.io/projected/93d5a0ee-784b-45f0-bdad-5df1f824f031-kube-api-access-k2r6g\") pod \"93d5a0ee-784b-45f0-bdad-5df1f824f031\" (UID: \"93d5a0ee-784b-45f0-bdad-5df1f824f031\") " Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.664112 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-utilities" (OuterVolumeSpecName: "utilities") pod "93d5a0ee-784b-45f0-bdad-5df1f824f031" (UID: "93d5a0ee-784b-45f0-bdad-5df1f824f031"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.668704 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d5a0ee-784b-45f0-bdad-5df1f824f031-kube-api-access-k2r6g" (OuterVolumeSpecName: "kube-api-access-k2r6g") pod "93d5a0ee-784b-45f0-bdad-5df1f824f031" (UID: "93d5a0ee-784b-45f0-bdad-5df1f824f031"). InnerVolumeSpecName "kube-api-access-k2r6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.676043 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93d5a0ee-784b-45f0-bdad-5df1f824f031" (UID: "93d5a0ee-784b-45f0-bdad-5df1f824f031"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.764441 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.764670 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2r6g\" (UniqueName: \"kubernetes.io/projected/93d5a0ee-784b-45f0-bdad-5df1f824f031-kube-api-access-k2r6g\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:09 crc kubenswrapper[4892]: I1006 12:12:09.764733 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93d5a0ee-784b-45f0-bdad-5df1f824f031-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.176180 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" path="/var/lib/kubelet/pods/9e19c6b4-23cb-4864-9470-ef8acaa1f5fc/volumes" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.301496 4892 generic.go:334] "Generic (PLEG): container finished" podID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerID="828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6" exitCode=0 Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.301552 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdlc5" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.301572 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdlc5" event={"ID":"93d5a0ee-784b-45f0-bdad-5df1f824f031","Type":"ContainerDied","Data":"828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6"} Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.301954 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdlc5" event={"ID":"93d5a0ee-784b-45f0-bdad-5df1f824f031","Type":"ContainerDied","Data":"996a9b8e103079f59ae0cff60076b212c577d873fcdd12c4f9cec431084312ed"} Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.302032 4892 scope.go:117] "RemoveContainer" containerID="828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.322094 4892 scope.go:117] "RemoveContainer" containerID="00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.326780 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdlc5"] Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.330620 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdlc5"] Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.334274 4892 scope.go:117] "RemoveContainer" containerID="b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.355531 4892 scope.go:117] "RemoveContainer" containerID="828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6" Oct 06 12:12:10 crc kubenswrapper[4892]: E1006 12:12:10.356030 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6\": container with ID 
starting with 828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6 not found: ID does not exist" containerID="828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.356119 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6"} err="failed to get container status \"828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6\": rpc error: code = NotFound desc = could not find container \"828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6\": container with ID starting with 828cae4df6ea5cba452ebe2b9e40db02f375421d02b32d84a0bc6cda43045dc6 not found: ID does not exist" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.356166 4892 scope.go:117] "RemoveContainer" containerID="00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4" Oct 06 12:12:10 crc kubenswrapper[4892]: E1006 12:12:10.356585 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4\": container with ID starting with 00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4 not found: ID does not exist" containerID="00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.356617 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4"} err="failed to get container status \"00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4\": rpc error: code = NotFound desc = could not find container \"00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4\": container with ID starting with 00f5ddaf9da970025b2583d668e8a72cb34c4533cd06ad5d30d6f290869838f4 not found: ID does not exist" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.356638 4892 scope.go:117] "RemoveContainer" containerID="b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c" Oct 06 12:12:10 crc kubenswrapper[4892]: E1006 12:12:10.356999 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c\": container with ID starting with b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c not found: ID does not exist" containerID="b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c" Oct 06 12:12:10 crc kubenswrapper[4892]: I1006 12:12:10.357029 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c"} err="failed to get container status \"b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c\": rpc error: code = NotFound desc = could not find container \"b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c\": container with ID starting with b72f3304681b17bf73d6bbc03e6c23c3e9b562b6a9520702047a37247f664a1c not found: ID does not exist" Oct 06 12:12:12 crc kubenswrapper[4892]: I1006 12:12:12.174135 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" path="/var/lib/kubelet/pods/93d5a0ee-784b-45f0-bdad-5df1f824f031/volumes" Oct 06 12:12:15 crc kubenswrapper[4892]: I1006 
12:12:15.024092 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9q95"] Oct 06 12:12:22 crc kubenswrapper[4892]: I1006 12:12:22.984597 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:12:22 crc kubenswrapper[4892]: I1006 12:12:22.985580 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:12:22 crc kubenswrapper[4892]: I1006 12:12:22.985668 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:12:22 crc kubenswrapper[4892]: I1006 12:12:22.986794 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:12:22 crc kubenswrapper[4892]: I1006 12:12:22.986956 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493" gracePeriod=600 Oct 06 12:12:23 crc kubenswrapper[4892]: I1006 12:12:23.377734 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493" exitCode=0 Oct 06 12:12:23 crc kubenswrapper[4892]: I1006 12:12:23.377863 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493"} Oct 06 12:12:23 crc kubenswrapper[4892]: I1006 12:12:23.378120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"fe252163f7a2babdff1e10cc57f09f2bd93ebf81d17649e7aa36213a49197603"} Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.056754 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" containerName="oauth-openshift" containerID="cri-o://a949af274614a59b0c009b2e3c84e712848fd14a10d454b5ed09880b3b9b0713" gracePeriod=15 Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.507760 4892 generic.go:334] "Generic (PLEG): container finished" podID="d6cd9565-520a-47d6-bb93-7423147863ef" containerID="a949af274614a59b0c009b2e3c84e712848fd14a10d454b5ed09880b3b9b0713" exitCode=0 Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.507826 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" event={"ID":"d6cd9565-520a-47d6-bb93-7423147863ef","Type":"ContainerDied","Data":"a949af274614a59b0c009b2e3c84e712848fd14a10d454b5ed09880b3b9b0713"} Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.590193 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.622976 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4"] Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623152 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623163 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623175 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623181 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623190 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623196 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623205 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623211 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623220 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" containerName="oauth-openshift" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623226 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" containerName="oauth-openshift" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623234 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623240 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623248 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623253 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623263 4892 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623269 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623279 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623285 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="extract-content" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623290 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623298 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623304 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824b68fe-6b22-44a5-98d1-1db5623d17c9" containerName="pruner" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623310 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="824b68fe-6b22-44a5-98d1-1db5623d17c9" containerName="pruner" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623318 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623342 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="extract-utilities" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623349 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623355 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: E1006 12:12:40.623365 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623371 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623455 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="824b68fe-6b22-44a5-98d1-1db5623d17c9" containerName="pruner" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623465 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfaba287-8034-4e1b-a4cd-f6c7962f9d45" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623472 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="64484fe6-cd8c-492d-9fd5-19dc11f559b8" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623482 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d5a0ee-784b-45f0-bdad-5df1f824f031" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623492 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" containerName="oauth-openshift" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623497 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e19c6b4-23cb-4864-9470-ef8acaa1f5fc" containerName="registry-server" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.623829 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.649236 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4"] Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672195 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-router-certs\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672260 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-serving-cert\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672314 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-service-ca\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672417 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-provider-selection\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672458 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6cd9565-520a-47d6-bb93-7423147863ef-audit-dir\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672508 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-cliconfig\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672553 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-audit-policies\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672552 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/d6cd9565-520a-47d6-bb93-7423147863ef-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672593 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dhgn\" (UniqueName: \"kubernetes.io/projected/d6cd9565-520a-47d6-bb93-7423147863ef-kube-api-access-8dhgn\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672648 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-trusted-ca-bundle\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672689 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-ocp-branding-template\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672722 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-session\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672756 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-login\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672793 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-idp-0-file-data\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672841 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-error\") pod \"d6cd9565-520a-47d6-bb93-7423147863ef\" (UID: \"d6cd9565-520a-47d6-bb93-7423147863ef\") " Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.672998 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-audit-policies\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673070 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673106 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673142 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673180 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673226 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673260 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673298 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673361 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: 
\"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673392 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673431 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkvjx\" (UniqueName: \"kubernetes.io/projected/509266d2-5ed7-4929-8187-474370a4a96d-kube-api-access-qkvjx\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673477 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673522 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/509266d2-5ed7-4929-8187-474370a4a96d-audit-dir\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673558 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-session\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673626 4892 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d6cd9565-520a-47d6-bb93-7423147863ef-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673230 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673242 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.673661 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.674047 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.680530 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.681828 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.681945 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.682249 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.682625 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.684179 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.685736 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.687899 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cd9565-520a-47d6-bb93-7423147863ef-kube-api-access-8dhgn" (OuterVolumeSpecName: "kube-api-access-8dhgn") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "kube-api-access-8dhgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.688102 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d6cd9565-520a-47d6-bb93-7423147863ef" (UID: "d6cd9565-520a-47d6-bb93-7423147863ef"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774258 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774351 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774380 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774401 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774461 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkvjx\" (UniqueName: \"kubernetes.io/projected/509266d2-5ed7-4929-8187-474370a4a96d-kube-api-access-qkvjx\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774517 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: 
\"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774615 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/509266d2-5ed7-4929-8187-474370a4a96d-audit-dir\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-session\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774709 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-audit-policies\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774774 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774797 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774819 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774891 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774930 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774944 4892 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-audit-policies\") on 
node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774957 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dhgn\" (UniqueName: \"kubernetes.io/projected/d6cd9565-520a-47d6-bb93-7423147863ef-kube-api-access-8dhgn\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774971 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.774983 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.775020 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.775035 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.775047 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.775061 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.775100 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.775111 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.775124 4892 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d6cd9565-520a-47d6-bb93-7423147863ef-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.776743 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-audit-policies\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.776830 4892 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/509266d2-5ed7-4929-8187-474370a4a96d-audit-dir\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.777778 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-service-ca\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.777856 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.780637 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.780637 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.781030 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.782372 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-router-certs\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.783254 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.783685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-error\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.785531 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.785603 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-system-session\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.786665 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/509266d2-5ed7-4929-8187-474370a4a96d-v4-0-config-user-template-login\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.800263 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkvjx\" (UniqueName: \"kubernetes.io/projected/509266d2-5ed7-4929-8187-474370a4a96d-kube-api-access-qkvjx\") pod \"oauth-openshift-68b6dd9b65-z7vt4\" (UID: \"509266d2-5ed7-4929-8187-474370a4a96d\") " pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:40 crc kubenswrapper[4892]: I1006 12:12:40.946112 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:41 crc kubenswrapper[4892]: I1006 12:12:41.440631 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4"] Oct 06 12:12:41 crc kubenswrapper[4892]: I1006 12:12:41.517620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" event={"ID":"d6cd9565-520a-47d6-bb93-7423147863ef","Type":"ContainerDied","Data":"ae399727ed570f0fdaab4edc55dc2108195f40d7c47b28beaeb528cc68dc6547"} Oct 06 12:12:41 crc kubenswrapper[4892]: I1006 12:12:41.517701 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-h9q95" Oct 06 12:12:41 crc kubenswrapper[4892]: I1006 12:12:41.517711 4892 scope.go:117] "RemoveContainer" containerID="a949af274614a59b0c009b2e3c84e712848fd14a10d454b5ed09880b3b9b0713" Oct 06 12:12:41 crc kubenswrapper[4892]: I1006 12:12:41.521860 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" event={"ID":"509266d2-5ed7-4929-8187-474370a4a96d","Type":"ContainerStarted","Data":"cb55d8c253e76a01fd9842624cbeeca86952bb7f5f837398882420c27c38c9dc"} Oct 06 12:12:41 crc kubenswrapper[4892]: I1006 12:12:41.565537 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9q95"] Oct 06 12:12:41 crc kubenswrapper[4892]: I1006 12:12:41.569825 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-h9q95"] Oct 06 12:12:42 crc kubenswrapper[4892]: I1006 12:12:42.185121 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cd9565-520a-47d6-bb93-7423147863ef" path="/var/lib/kubelet/pods/d6cd9565-520a-47d6-bb93-7423147863ef/volumes" Oct 06 12:12:42 crc kubenswrapper[4892]: I1006 12:12:42.535788 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" event={"ID":"509266d2-5ed7-4929-8187-474370a4a96d","Type":"ContainerStarted","Data":"fc930590d5f3c1748c763272b5a92b983411f7a40015c1905d7c8c79f0ddd33d"} Oct 06 12:12:42 crc kubenswrapper[4892]: I1006 12:12:42.536377 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:42 crc kubenswrapper[4892]: I1006 12:12:42.546042 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" Oct 06 12:12:42 crc kubenswrapper[4892]: I1006 12:12:42.582080 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68b6dd9b65-z7vt4" podStartSLOduration=27.582055676 podStartE2EDuration="27.582055676s" podCreationTimestamp="2025-10-06 12:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:12:42.579955702 +0000 UTC m=+249.129661507" watchObservedRunningTime="2025-10-06 12:12:42.582055676 +0000 UTC m=+249.131761471" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.184622 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44kmf"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.185276 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-44kmf" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="registry-server" containerID="cri-o://3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec" gracePeriod=30 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.196791 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2zd5m"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.197013 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2zd5m" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="registry-server" 
containerID="cri-o://d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce" gracePeriod=30 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.209002 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4rs"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.209199 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" podUID="aac1222e-f92a-4345-8ca2-125d2d2c2627" containerName="marketplace-operator" containerID="cri-o://6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf" gracePeriod=30 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.216182 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf27c"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.216587 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cf27c" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="registry-server" containerID="cri-o://fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1" gracePeriod=30 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.227112 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8s2rt"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.227490 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8s2rt" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="registry-server" containerID="cri-o://afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b" gracePeriod=30 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.233814 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bdjg5"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.234624 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.248210 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bdjg5"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.370749 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.370844 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.370889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvdr\" (UniqueName: \"kubernetes.io/projected/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-kube-api-access-qlvdr\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.472231 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvdr\" (UniqueName: \"kubernetes.io/projected/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-kube-api-access-qlvdr\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.472289 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.472353 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.473787 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.478717 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.488122 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvdr\" (UniqueName: \"kubernetes.io/projected/df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a-kube-api-access-qlvdr\") pod \"marketplace-operator-79b997595-bdjg5\" (UID: \"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a\") " pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.624757 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.640727 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.644895 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.649776 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.654078 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.681819 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz56n\" (UniqueName: \"kubernetes.io/projected/27e60858-52de-4a1a-aa13-c3cd5b23747d-kube-api-access-kz56n\") pod \"27e60858-52de-4a1a-aa13-c3cd5b23747d\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.681871 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-utilities\") pod \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.681898 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsktw\" (UniqueName: \"kubernetes.io/projected/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-kube-api-access-lsktw\") pod \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.681932 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-catalog-content\") pod \"84310366-fec4-4521-a296-7fdba4b65821\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.681997 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jppnl\" (UniqueName: \"kubernetes.io/projected/84310366-fec4-4521-a296-7fdba4b65821-kube-api-access-jppnl\") pod \"84310366-fec4-4521-a296-7fdba4b65821\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.682018 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-utilities\") pod \"84310366-fec4-4521-a296-7fdba4b65821\" (UID: \"84310366-fec4-4521-a296-7fdba4b65821\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.682033 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-catalog-content\") pod \"27e60858-52de-4a1a-aa13-c3cd5b23747d\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.682048 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-utilities\") pod \"27e60858-52de-4a1a-aa13-c3cd5b23747d\" (UID: \"27e60858-52de-4a1a-aa13-c3cd5b23747d\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.682078 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-utilities\") pod \"8e102df5-64d4-4682-9cb1-22a7165f4294\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.682102 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nwsl\" (UniqueName: \"kubernetes.io/projected/8e102df5-64d4-4682-9cb1-22a7165f4294-kube-api-access-8nwsl\") pod \"8e102df5-64d4-4682-9cb1-22a7165f4294\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.682125 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-catalog-content\") pod \"8e102df5-64d4-4682-9cb1-22a7165f4294\" (UID: \"8e102df5-64d4-4682-9cb1-22a7165f4294\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.682142 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-catalog-content\") pod \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\" (UID: \"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.687080 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-utilities" (OuterVolumeSpecName: "utilities") pod "6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" (UID: "6bd30bed-dc7b-40f4-8e33-3289b8ec42ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.687821 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-utilities" (OuterVolumeSpecName: "utilities") pod "27e60858-52de-4a1a-aa13-c3cd5b23747d" (UID: "27e60858-52de-4a1a-aa13-c3cd5b23747d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.691779 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-utilities" (OuterVolumeSpecName: "utilities") pod "8e102df5-64d4-4682-9cb1-22a7165f4294" (UID: "8e102df5-64d4-4682-9cb1-22a7165f4294"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.698166 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-utilities" (OuterVolumeSpecName: "utilities") pod "84310366-fec4-4521-a296-7fdba4b65821" (UID: "84310366-fec4-4521-a296-7fdba4b65821"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.701404 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.707179 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-kube-api-access-lsktw" (OuterVolumeSpecName: "kube-api-access-lsktw") pod "6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" (UID: "6bd30bed-dc7b-40f4-8e33-3289b8ec42ac"). InnerVolumeSpecName "kube-api-access-lsktw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.715759 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e102df5-64d4-4682-9cb1-22a7165f4294-kube-api-access-8nwsl" (OuterVolumeSpecName: "kube-api-access-8nwsl") pod "8e102df5-64d4-4682-9cb1-22a7165f4294" (UID: "8e102df5-64d4-4682-9cb1-22a7165f4294"). InnerVolumeSpecName "kube-api-access-8nwsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.716027 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e102df5-64d4-4682-9cb1-22a7165f4294" (UID: "8e102df5-64d4-4682-9cb1-22a7165f4294"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.717516 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84310366-fec4-4521-a296-7fdba4b65821-kube-api-access-jppnl" (OuterVolumeSpecName: "kube-api-access-jppnl") pod "84310366-fec4-4521-a296-7fdba4b65821" (UID: "84310366-fec4-4521-a296-7fdba4b65821"). InnerVolumeSpecName "kube-api-access-jppnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.719468 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e60858-52de-4a1a-aa13-c3cd5b23747d-kube-api-access-kz56n" (OuterVolumeSpecName: "kube-api-access-kz56n") pod "27e60858-52de-4a1a-aa13-c3cd5b23747d" (UID: "27e60858-52de-4a1a-aa13-c3cd5b23747d"). InnerVolumeSpecName "kube-api-access-kz56n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.723981 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerID="fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1" exitCode=0 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.724047 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf27c" event={"ID":"8e102df5-64d4-4682-9cb1-22a7165f4294","Type":"ContainerDied","Data":"fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.724078 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cf27c" event={"ID":"8e102df5-64d4-4682-9cb1-22a7165f4294","Type":"ContainerDied","Data":"291b2ce22f180c439017d6fcc8a89d51bf5deae515db7895c8bcffab8382f139"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.724097 4892 scope.go:117] "RemoveContainer" containerID="fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.724249 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cf27c" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.759205 4892 generic.go:334] "Generic (PLEG): container finished" podID="84310366-fec4-4521-a296-7fdba4b65821" containerID="3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec" exitCode=0 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.759314 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44kmf" event={"ID":"84310366-fec4-4521-a296-7fdba4b65821","Type":"ContainerDied","Data":"3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.759361 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44kmf" event={"ID":"84310366-fec4-4521-a296-7fdba4b65821","Type":"ContainerDied","Data":"a331f0fd0480227b56592efd79d6fbf07421d926d8cedb9fd0fa7f67872ce43e"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.759451 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-44kmf" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.788724 4892 scope.go:117] "RemoveContainer" containerID="e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789103 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-operator-metrics\") pod \"aac1222e-f92a-4345-8ca2-125d2d2c2627\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789188 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-trusted-ca\") pod \"aac1222e-f92a-4345-8ca2-125d2d2c2627\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789250 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9kds\" (UniqueName: \"kubernetes.io/projected/aac1222e-f92a-4345-8ca2-125d2d2c2627-kube-api-access-k9kds\") pod \"aac1222e-f92a-4345-8ca2-125d2d2c2627\" (UID: \"aac1222e-f92a-4345-8ca2-125d2d2c2627\") " Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789266 4892 generic.go:334] "Generic (PLEG): container finished" podID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerID="d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce" exitCode=0 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789355 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zd5m" event={"ID":"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac","Type":"ContainerDied","Data":"d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789382 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2zd5m" event={"ID":"6bd30bed-dc7b-40f4-8e33-3289b8ec42ac","Type":"ContainerDied","Data":"cec97df218026ff0993cd7038867162ee0f9559cad04e0ca336961f84baf0511"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789434 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jppnl\" (UniqueName: \"kubernetes.io/projected/84310366-fec4-4521-a296-7fdba4b65821-kube-api-access-jppnl\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789446 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789456 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789465 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789475 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nwsl\" (UniqueName: 
\"kubernetes.io/projected/8e102df5-64d4-4682-9cb1-22a7165f4294-kube-api-access-8nwsl\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789483 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e102df5-64d4-4682-9cb1-22a7165f4294-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789492 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz56n\" (UniqueName: \"kubernetes.io/projected/27e60858-52de-4a1a-aa13-c3cd5b23747d-kube-api-access-kz56n\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789499 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789507 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsktw\" (UniqueName: \"kubernetes.io/projected/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-kube-api-access-lsktw\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.789543 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2zd5m" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.807673 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aac1222e-f92a-4345-8ca2-125d2d2c2627-kube-api-access-k9kds" (OuterVolumeSpecName: "kube-api-access-k9kds") pod "aac1222e-f92a-4345-8ca2-125d2d2c2627" (UID: "aac1222e-f92a-4345-8ca2-125d2d2c2627"). InnerVolumeSpecName "kube-api-access-k9kds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.835589 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "aac1222e-f92a-4345-8ca2-125d2d2c2627" (UID: "aac1222e-f92a-4345-8ca2-125d2d2c2627"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.837083 4892 scope.go:117] "RemoveContainer" containerID="c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.837863 4892 generic.go:334] "Generic (PLEG): container finished" podID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerID="afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b" exitCode=0 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.838184 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s2rt" event={"ID":"27e60858-52de-4a1a-aa13-c3cd5b23747d","Type":"ContainerDied","Data":"afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.838209 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8s2rt" event={"ID":"27e60858-52de-4a1a-aa13-c3cd5b23747d","Type":"ContainerDied","Data":"397937973ce46928b3a3997fbe66e84d36c1755d7df1b3cd24b1d4eda06603e8"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.838600 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8s2rt" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.846455 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf27c"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.855250 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "aac1222e-f92a-4345-8ca2-125d2d2c2627" (UID: "aac1222e-f92a-4345-8ca2-125d2d2c2627"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.861297 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cf27c"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.873250 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84310366-fec4-4521-a296-7fdba4b65821" (UID: "84310366-fec4-4521-a296-7fdba4b65821"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.876576 4892 generic.go:334] "Generic (PLEG): container finished" podID="aac1222e-f92a-4345-8ca2-125d2d2c2627" containerID="6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf" exitCode=0 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.876617 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" event={"ID":"aac1222e-f92a-4345-8ca2-125d2d2c2627","Type":"ContainerDied","Data":"6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.876643 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" event={"ID":"aac1222e-f92a-4345-8ca2-125d2d2c2627","Type":"ContainerDied","Data":"3077d4497f24bb3b7f9ab8599ab5676a90ddd14142211a62d5e15a0f61fa361c"} Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.876724 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gj4rs" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.879922 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" (UID: "6bd30bed-dc7b-40f4-8e33-3289b8ec42ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.890356 4892 scope.go:117] "RemoveContainer" containerID="fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1" Oct 06 12:13:03 crc kubenswrapper[4892]: E1006 12:13:03.893606 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1\": container with ID starting with fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1 not found: ID does not exist" containerID="fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.893657 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1"} err="failed to get container status \"fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1\": rpc error: code = NotFound desc = could not find container \"fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1\": container with ID starting with fc5d213291a1fc8bd1e2160b895eb2bd9f75b042bc4e773fe0cce986b00048c1 not found: ID does not exist" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.893691 4892 scope.go:117] "RemoveContainer" containerID="e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4" Oct 06 12:13:03 crc kubenswrapper[4892]: E1006 12:13:03.894613 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4\": container with ID starting with e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4 not found: ID does not exist" containerID="e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.894858 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4"} err="failed to get container status \"e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4\": rpc error: code = NotFound desc = could not find container \"e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4\": container with ID starting with e53d18255b180b9ed77e9a5873ae9203ce46349c267e21056ff79ce260bd7ad4 not found: ID does not exist" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.894886 4892 scope.go:117] "RemoveContainer" containerID="c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.895956 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.895981 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9kds\" (UniqueName: \"kubernetes.io/projected/aac1222e-f92a-4345-8ca2-125d2d2c2627-kube-api-access-k9kds\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.895992 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 
12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.896001 4892 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/aac1222e-f92a-4345-8ca2-125d2d2c2627-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.896011 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84310366-fec4-4521-a296-7fdba4b65821-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:03 crc kubenswrapper[4892]: E1006 12:13:03.896882 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458\": container with ID starting with c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458 not found: ID does not exist" containerID="c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.896908 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458"} err="failed to get container status \"c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458\": rpc error: code = NotFound desc = could not find container \"c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458\": container with ID starting with c17e37e042783ba0c51393f5e0814e2e0b04467558ac9c6e1e1a673f5af9a458 not found: ID does not exist" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.896927 4892 scope.go:117] "RemoveContainer" containerID="3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.923192 4892 scope.go:117] "RemoveContainer" containerID="e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.923674 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4rs"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.928388 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gj4rs"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.931268 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bdjg5"] Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.946555 4892 scope.go:117] "RemoveContainer" containerID="b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108" Oct 06 12:13:03 crc kubenswrapper[4892]: W1006 12:13:03.950392 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf6c1edd_eb9c_4b9d_a557_9dfa585c8a8a.slice/crio-276b5d4b79d04074890a98595d18d13e0b71bc1371666f214a8b0b7a33de1ae2 WatchSource:0}: Error finding container 276b5d4b79d04074890a98595d18d13e0b71bc1371666f214a8b0b7a33de1ae2: Status 404 returned error can't find the container with id 276b5d4b79d04074890a98595d18d13e0b71bc1371666f214a8b0b7a33de1ae2 Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.951737 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27e60858-52de-4a1a-aa13-c3cd5b23747d" (UID: 
"27e60858-52de-4a1a-aa13-c3cd5b23747d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.960617 4892 scope.go:117] "RemoveContainer" containerID="3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec" Oct 06 12:13:03 crc kubenswrapper[4892]: E1006 12:13:03.961924 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec\": container with ID starting with 3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec not found: ID does not exist" containerID="3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.961965 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec"} err="failed to get container status \"3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec\": rpc error: code = NotFound desc = could not find container \"3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec\": container with ID starting with 3018cbf2349ae92db17e151a824cd18b314368327f76366da22cb5271bf133ec not found: ID does not exist" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.961992 4892 scope.go:117] "RemoveContainer" containerID="e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b" Oct 06 12:13:03 crc kubenswrapper[4892]: E1006 12:13:03.962855 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b\": container with ID starting with e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b not found: ID does not exist" containerID="e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.962873 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b"} err="failed to get container status \"e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b\": rpc error: code = NotFound desc = could not find container \"e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b\": container with ID starting with e87d3282164444184a7dd8c45bcb204e4d00c16b15c723612ba31a391cd2c14b not found: ID does not exist" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.962886 4892 scope.go:117] "RemoveContainer" containerID="b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108" Oct 06 12:13:03 crc kubenswrapper[4892]: E1006 12:13:03.964414 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108\": container with ID starting with b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108 not found: ID does not exist" containerID="b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.964440 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108"} err="failed to get container status 
\"b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108\": rpc error: code = NotFound desc = could not find container \"b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108\": container with ID starting with b800c9af030f32aa11dc5f2bb9721343682bfeab8453790dc74c35800b491108 not found: ID does not exist" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.964453 4892 scope.go:117] "RemoveContainer" containerID="d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.980029 4892 scope.go:117] "RemoveContainer" containerID="c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.995797 4892 scope.go:117] "RemoveContainer" containerID="ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b" Oct 06 12:13:03 crc kubenswrapper[4892]: I1006 12:13:03.996486 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27e60858-52de-4a1a-aa13-c3cd5b23747d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.027287 4892 scope.go:117] "RemoveContainer" containerID="d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce" Oct 06 12:13:04 crc kubenswrapper[4892]: E1006 12:13:04.027706 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce\": container with ID starting with d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce not found: ID does not exist" containerID="d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.027761 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce"} err="failed to get container status \"d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce\": rpc error: code = NotFound desc = could not find container \"d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce\": container with ID starting with d425094f6823a3535bd94fa222e50343cb1a40ec2a1b6c26d38e59adc25b01ce not found: ID does not exist" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.027794 4892 scope.go:117] "RemoveContainer" containerID="c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c" Oct 06 12:13:04 crc kubenswrapper[4892]: E1006 12:13:04.028147 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c\": container with ID starting with c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c not found: ID does not exist" containerID="c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.028185 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c"} err="failed to get container status \"c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c\": rpc error: code = NotFound desc = could not find container \"c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c\": container with ID starting with 
c5e54fdde971838f3f4a93ba139a09dcf4e30ca3ed920ce7ad5e9520582cd09c not found: ID does not exist" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.028213 4892 scope.go:117] "RemoveContainer" containerID="ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b" Oct 06 12:13:04 crc kubenswrapper[4892]: E1006 12:13:04.028490 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b\": container with ID starting with ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b not found: ID does not exist" containerID="ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.028509 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b"} err="failed to get container status \"ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b\": rpc error: code = NotFound desc = could not find container \"ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b\": container with ID starting with ce9e4168cafb5bae4b0e2dcb99353c85e0630e32d7de010e12d7bb6b631d774b not found: ID does not exist" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.028521 4892 scope.go:117] "RemoveContainer" containerID="afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.047934 4892 scope.go:117] "RemoveContainer" containerID="cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.062840 4892 scope.go:117] "RemoveContainer" containerID="472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.079639 4892 scope.go:117] "RemoveContainer" containerID="afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b" Oct 06 12:13:04 crc kubenswrapper[4892]: E1006 12:13:04.080724 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b\": container with ID starting with afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b not found: ID does not exist" containerID="afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.080752 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b"} err="failed to get container status \"afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b\": rpc error: code = NotFound desc = could not find container \"afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b\": container with ID starting with afc1ee19ed67c385926c0f6fa21c9ec04f3b0827f83cae3cb0f3627a8d309e1b not found: ID does not exist" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.080772 4892 scope.go:117] "RemoveContainer" containerID="cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.082816 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44kmf"] Oct 06 12:13:04 crc kubenswrapper[4892]: E1006 12:13:04.082989 4892 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323\": container with ID starting with cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323 not found: ID does not exist" containerID="cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.083067 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323"} err="failed to get container status \"cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323\": rpc error: code = NotFound desc = could not find container \"cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323\": container with ID starting with cf920e31ab00c6162430ad96078c81bab826402747fd75e08a06a7935fbd9323 not found: ID does not exist" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.083155 4892 scope.go:117] "RemoveContainer" containerID="472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460" Oct 06 12:13:04 crc kubenswrapper[4892]: E1006 12:13:04.085654 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460\": container with ID starting with 472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460 not found: ID does not exist" containerID="472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.085689 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460"} err="failed to get container status \"472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460\": rpc error: code = NotFound desc = could not find container \"472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460\": container with ID starting with 472073b9336fd1838d7fa2382eaba6d3df36c57d655cfd020df9bb28586f0460 not found: ID does not exist" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.085713 4892 scope.go:117] "RemoveContainer" containerID="6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.085919 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-44kmf"] Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.100239 4892 scope.go:117] "RemoveContainer" containerID="6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf" Oct 06 12:13:04 crc kubenswrapper[4892]: E1006 12:13:04.100619 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf\": container with ID starting with 6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf not found: ID does not exist" containerID="6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.100645 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf"} err="failed to get container status \"6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf\": rpc error: code = 
NotFound desc = could not find container \"6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf\": container with ID starting with 6913e5efd0000a7e2095fcce8b94d5616796068d1bb700a6574a1fc2debb78cf not found: ID does not exist" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.120080 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2zd5m"] Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.122348 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2zd5m"] Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.159146 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8s2rt"] Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.161769 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8s2rt"] Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.174615 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" path="/var/lib/kubelet/pods/27e60858-52de-4a1a-aa13-c3cd5b23747d/volumes" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.175264 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" path="/var/lib/kubelet/pods/6bd30bed-dc7b-40f4-8e33-3289b8ec42ac/volumes" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.175979 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84310366-fec4-4521-a296-7fdba4b65821" path="/var/lib/kubelet/pods/84310366-fec4-4521-a296-7fdba4b65821/volumes" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.177046 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" path="/var/lib/kubelet/pods/8e102df5-64d4-4682-9cb1-22a7165f4294/volumes" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.177716 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aac1222e-f92a-4345-8ca2-125d2d2c2627" path="/var/lib/kubelet/pods/aac1222e-f92a-4345-8ca2-125d2d2c2627/volumes" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.882584 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" event={"ID":"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a","Type":"ContainerStarted","Data":"abfd1e55483863b05bcf823e7f7b18e88b51ce609001aca43bb4f41b51ef4689"} Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.882882 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.882895 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" event={"ID":"df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a","Type":"ContainerStarted","Data":"276b5d4b79d04074890a98595d18d13e0b71bc1371666f214a8b0b7a33de1ae2"} Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.887386 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" Oct 06 12:13:04 crc kubenswrapper[4892]: I1006 12:13:04.923204 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bdjg5" podStartSLOduration=1.923187985 podStartE2EDuration="1.923187985s" podCreationTimestamp="2025-10-06 12:13:03 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:13:04.904629673 +0000 UTC m=+271.454335438" watchObservedRunningTime="2025-10-06 12:13:04.923187985 +0000 UTC m=+271.472893740" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.406753 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j2hm7"] Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.406954 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.406968 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.406978 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.406986 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.406998 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407004 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407014 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407020 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407032 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407037 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407048 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aac1222e-f92a-4345-8ca2-125d2d2c2627" containerName="marketplace-operator" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407054 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="aac1222e-f92a-4345-8ca2-125d2d2c2627" containerName="marketplace-operator" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407061 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407066 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407073 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407079 4892 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407088 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407093 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407100 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407106 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407114 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407119 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407126 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407131 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="extract-content" Oct 06 12:13:05 crc kubenswrapper[4892]: E1006 12:13:05.407140 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407146 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="extract-utilities" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407227 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e60858-52de-4a1a-aa13-c3cd5b23747d" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407238 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="aac1222e-f92a-4345-8ca2-125d2d2c2627" containerName="marketplace-operator" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407248 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="84310366-fec4-4521-a296-7fdba4b65821" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407255 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e102df5-64d4-4682-9cb1-22a7165f4294" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407263 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd30bed-dc7b-40f4-8e33-3289b8ec42ac" containerName="registry-server" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.407985 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.411575 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.424294 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2hm7"] Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.514421 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bed95b1-6340-43d9-88be-140829a9a0ab-utilities\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.514506 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bed95b1-6340-43d9-88be-140829a9a0ab-catalog-content\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.514589 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jn5\" (UniqueName: \"kubernetes.io/projected/1bed95b1-6340-43d9-88be-140829a9a0ab-kube-api-access-g7jn5\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.612309 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8n2m6"] Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.613298 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.615772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bed95b1-6340-43d9-88be-140829a9a0ab-utilities\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.615826 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bed95b1-6340-43d9-88be-140829a9a0ab-catalog-content\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.615880 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jn5\" (UniqueName: \"kubernetes.io/projected/1bed95b1-6340-43d9-88be-140829a9a0ab-kube-api-access-g7jn5\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.616541 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bed95b1-6340-43d9-88be-140829a9a0ab-utilities\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.616805 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bed95b1-6340-43d9-88be-140829a9a0ab-catalog-content\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.618453 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.620347 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8n2m6"] Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.639656 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jn5\" (UniqueName: \"kubernetes.io/projected/1bed95b1-6340-43d9-88be-140829a9a0ab-kube-api-access-g7jn5\") pod \"redhat-marketplace-j2hm7\" (UID: \"1bed95b1-6340-43d9-88be-140829a9a0ab\") " pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.717085 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbp75\" (UniqueName: \"kubernetes.io/projected/55ef17bc-6b08-450b-947d-1e3c5eb5f806-kube-api-access-hbp75\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.717135 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-utilities\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " 
pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.717189 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-catalog-content\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.737581 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.818569 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbp75\" (UniqueName: \"kubernetes.io/projected/55ef17bc-6b08-450b-947d-1e3c5eb5f806-kube-api-access-hbp75\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.818625 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-utilities\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.818647 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-catalog-content\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.819227 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-catalog-content\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.819741 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-utilities\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.839746 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbp75\" (UniqueName: \"kubernetes.io/projected/55ef17bc-6b08-450b-947d-1e3c5eb5f806-kube-api-access-hbp75\") pod \"redhat-operators-8n2m6\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.924469 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j2hm7"] Oct 06 12:13:05 crc kubenswrapper[4892]: I1006 12:13:05.963242 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:06 crc kubenswrapper[4892]: I1006 12:13:06.136665 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8n2m6"] Oct 06 12:13:06 crc kubenswrapper[4892]: W1006 12:13:06.153702 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ef17bc_6b08_450b_947d_1e3c5eb5f806.slice/crio-f4e8ac7f20926177433e7110fbe7b6b2bcedd0d308b28a19d29c1bbfc454e7b5 WatchSource:0}: Error finding container f4e8ac7f20926177433e7110fbe7b6b2bcedd0d308b28a19d29c1bbfc454e7b5: Status 404 returned error can't find the container with id f4e8ac7f20926177433e7110fbe7b6b2bcedd0d308b28a19d29c1bbfc454e7b5 Oct 06 12:13:06 crc kubenswrapper[4892]: I1006 12:13:06.899187 4892 generic.go:334] "Generic (PLEG): container finished" podID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerID="fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796" exitCode=0 Oct 06 12:13:06 crc kubenswrapper[4892]: I1006 12:13:06.899263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n2m6" event={"ID":"55ef17bc-6b08-450b-947d-1e3c5eb5f806","Type":"ContainerDied","Data":"fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796"} Oct 06 12:13:06 crc kubenswrapper[4892]: I1006 12:13:06.899290 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n2m6" event={"ID":"55ef17bc-6b08-450b-947d-1e3c5eb5f806","Type":"ContainerStarted","Data":"f4e8ac7f20926177433e7110fbe7b6b2bcedd0d308b28a19d29c1bbfc454e7b5"} Oct 06 12:13:06 crc kubenswrapper[4892]: I1006 12:13:06.900604 4892 generic.go:334] "Generic (PLEG): container finished" podID="1bed95b1-6340-43d9-88be-140829a9a0ab" containerID="59e9cf1e98910d93c6638bd3dfed4d034948c45a203646688e5081d532f3ac41" exitCode=0 Oct 06 12:13:06 crc kubenswrapper[4892]: I1006 12:13:06.900662 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2hm7" event={"ID":"1bed95b1-6340-43d9-88be-140829a9a0ab","Type":"ContainerDied","Data":"59e9cf1e98910d93c6638bd3dfed4d034948c45a203646688e5081d532f3ac41"} Oct 06 12:13:06 crc kubenswrapper[4892]: I1006 12:13:06.900902 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2hm7" event={"ID":"1bed95b1-6340-43d9-88be-140829a9a0ab","Type":"ContainerStarted","Data":"caed005c84e60f00de5f4bcba3f7a91482e3c05712bd1f5f290e8cc41a38cfb1"} Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.798867 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q8fjn"] Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.800823 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.804955 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.831988 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8fjn"] Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.848120 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4k92\" (UniqueName: \"kubernetes.io/projected/44a5fd23-14c3-4215-a22c-9111e7a1c591-kube-api-access-l4k92\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.848177 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a5fd23-14c3-4215-a22c-9111e7a1c591-catalog-content\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.848202 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a5fd23-14c3-4215-a22c-9111e7a1c591-utilities\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.909263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2hm7" event={"ID":"1bed95b1-6340-43d9-88be-140829a9a0ab","Type":"ContainerStarted","Data":"e09917b41fefec336ae496fa6872dd2aac4f03d30db0311127f2eccad2046d3b"} Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.949604 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4k92\" (UniqueName: \"kubernetes.io/projected/44a5fd23-14c3-4215-a22c-9111e7a1c591-kube-api-access-l4k92\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.949670 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a5fd23-14c3-4215-a22c-9111e7a1c591-catalog-content\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.949709 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a5fd23-14c3-4215-a22c-9111e7a1c591-utilities\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.950385 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44a5fd23-14c3-4215-a22c-9111e7a1c591-utilities\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " 
pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.950706 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44a5fd23-14c3-4215-a22c-9111e7a1c591-catalog-content\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.984019 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4k92\" (UniqueName: \"kubernetes.io/projected/44a5fd23-14c3-4215-a22c-9111e7a1c591-kube-api-access-l4k92\") pod \"certified-operators-q8fjn\" (UID: \"44a5fd23-14c3-4215-a22c-9111e7a1c591\") " pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.998247 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kslsb"] Oct 06 12:13:07 crc kubenswrapper[4892]: I1006 12:13:07.999110 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.002199 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.011293 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kslsb"] Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.050890 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-utilities\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.050942 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-catalog-content\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.051144 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mpg\" (UniqueName: \"kubernetes.io/projected/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-kube-api-access-88mpg\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.128146 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.152304 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-catalog-content\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.152478 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mpg\" (UniqueName: \"kubernetes.io/projected/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-kube-api-access-88mpg\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.152570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-utilities\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.153225 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-utilities\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.153186 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-catalog-content\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.177619 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mpg\" (UniqueName: \"kubernetes.io/projected/cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3-kube-api-access-88mpg\") pod \"community-operators-kslsb\" (UID: \"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3\") " pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.343250 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8fjn"] Oct 06 12:13:08 crc kubenswrapper[4892]: W1006 12:13:08.372960 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a5fd23_14c3_4215_a22c_9111e7a1c591.slice/crio-05e7326cbd409f3ea1428e3f67cfddce6e12f4f1e0bd9cc406d24beb42bc6ac8 WatchSource:0}: Error finding container 05e7326cbd409f3ea1428e3f67cfddce6e12f4f1e0bd9cc406d24beb42bc6ac8: Status 404 returned error can't find the container with id 05e7326cbd409f3ea1428e3f67cfddce6e12f4f1e0bd9cc406d24beb42bc6ac8 Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.387350 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.608511 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kslsb"] Oct 06 12:13:08 crc kubenswrapper[4892]: W1006 12:13:08.628943 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd5f0e41_d114_4ef9_a0ab_ad3dbf5cd8c3.slice/crio-44883bb311ef39e0f9a0f7aad72e06e173c0acd07c962eca447e89735ee665ec WatchSource:0}: Error finding container 44883bb311ef39e0f9a0f7aad72e06e173c0acd07c962eca447e89735ee665ec: Status 404 returned error can't find the container with id 44883bb311ef39e0f9a0f7aad72e06e173c0acd07c962eca447e89735ee665ec Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.916770 4892 generic.go:334] "Generic (PLEG): container finished" podID="44a5fd23-14c3-4215-a22c-9111e7a1c591" containerID="ae680346922f1b222231a34fc2dbc948356eec526c24fe94483323206db60df3" exitCode=0 Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.916813 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8fjn" event={"ID":"44a5fd23-14c3-4215-a22c-9111e7a1c591","Type":"ContainerDied","Data":"ae680346922f1b222231a34fc2dbc948356eec526c24fe94483323206db60df3"} Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.917129 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8fjn" event={"ID":"44a5fd23-14c3-4215-a22c-9111e7a1c591","Type":"ContainerStarted","Data":"05e7326cbd409f3ea1428e3f67cfddce6e12f4f1e0bd9cc406d24beb42bc6ac8"} Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.920578 4892 generic.go:334] "Generic (PLEG): container finished" podID="cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3" containerID="14cac9c29f48bc1d7abcff3c9a0d206f3ef1a91cc3e5394f28c8fc58edd804d1" exitCode=0 Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.920704 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kslsb" event={"ID":"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3","Type":"ContainerDied","Data":"14cac9c29f48bc1d7abcff3c9a0d206f3ef1a91cc3e5394f28c8fc58edd804d1"} Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.920739 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kslsb" event={"ID":"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3","Type":"ContainerStarted","Data":"44883bb311ef39e0f9a0f7aad72e06e173c0acd07c962eca447e89735ee665ec"} Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.923767 4892 generic.go:334] "Generic (PLEG): container finished" podID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerID="48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e" exitCode=0 Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.923818 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n2m6" event={"ID":"55ef17bc-6b08-450b-947d-1e3c5eb5f806","Type":"ContainerDied","Data":"48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e"} Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.926588 4892 generic.go:334] "Generic (PLEG): container finished" podID="1bed95b1-6340-43d9-88be-140829a9a0ab" containerID="e09917b41fefec336ae496fa6872dd2aac4f03d30db0311127f2eccad2046d3b" exitCode=0 Oct 06 12:13:08 crc kubenswrapper[4892]: I1006 12:13:08.926666 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-j2hm7" event={"ID":"1bed95b1-6340-43d9-88be-140829a9a0ab","Type":"ContainerDied","Data":"e09917b41fefec336ae496fa6872dd2aac4f03d30db0311127f2eccad2046d3b"} Oct 06 12:13:09 crc kubenswrapper[4892]: I1006 12:13:09.937044 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j2hm7" event={"ID":"1bed95b1-6340-43d9-88be-140829a9a0ab","Type":"ContainerStarted","Data":"e2cafa617bd94e177dd89411e53642f155c941b67a951d732213311b1da6dcb4"} Oct 06 12:13:09 crc kubenswrapper[4892]: I1006 12:13:09.939090 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kslsb" event={"ID":"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3","Type":"ContainerStarted","Data":"b3ef069cadded0511a3ccc71b21b63ae9c722f6774288a30e6cfe4110fa9d2d5"} Oct 06 12:13:09 crc kubenswrapper[4892]: I1006 12:13:09.942302 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n2m6" event={"ID":"55ef17bc-6b08-450b-947d-1e3c5eb5f806","Type":"ContainerStarted","Data":"9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609"} Oct 06 12:13:09 crc kubenswrapper[4892]: I1006 12:13:09.955685 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j2hm7" podStartSLOduration=2.493599171 podStartE2EDuration="4.955666475s" podCreationTimestamp="2025-10-06 12:13:05 +0000 UTC" firstStartedPulling="2025-10-06 12:13:06.901951186 +0000 UTC m=+273.451656961" lastFinishedPulling="2025-10-06 12:13:09.3640185 +0000 UTC m=+275.913724265" observedRunningTime="2025-10-06 12:13:09.952647055 +0000 UTC m=+276.502352820" watchObservedRunningTime="2025-10-06 12:13:09.955666475 +0000 UTC m=+276.505372240" Oct 06 12:13:09 crc kubenswrapper[4892]: I1006 12:13:09.971291 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8n2m6" podStartSLOduration=2.37044267 podStartE2EDuration="4.971259038s" podCreationTimestamp="2025-10-06 12:13:05 +0000 UTC" firstStartedPulling="2025-10-06 12:13:06.901818572 +0000 UTC m=+273.451524337" lastFinishedPulling="2025-10-06 12:13:09.50263493 +0000 UTC m=+276.052340705" observedRunningTime="2025-10-06 12:13:09.969568308 +0000 UTC m=+276.519274083" watchObservedRunningTime="2025-10-06 12:13:09.971259038 +0000 UTC m=+276.520964813" Oct 06 12:13:10 crc kubenswrapper[4892]: I1006 12:13:10.951237 4892 generic.go:334] "Generic (PLEG): container finished" podID="44a5fd23-14c3-4215-a22c-9111e7a1c591" containerID="6122b4a629e5897365060fd4bc16c5b702e8ccf43e519a2ba3dfeaf35b6348bb" exitCode=0 Oct 06 12:13:10 crc kubenswrapper[4892]: I1006 12:13:10.951319 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8fjn" event={"ID":"44a5fd23-14c3-4215-a22c-9111e7a1c591","Type":"ContainerDied","Data":"6122b4a629e5897365060fd4bc16c5b702e8ccf43e519a2ba3dfeaf35b6348bb"} Oct 06 12:13:10 crc kubenswrapper[4892]: I1006 12:13:10.953888 4892 generic.go:334] "Generic (PLEG): container finished" podID="cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3" containerID="b3ef069cadded0511a3ccc71b21b63ae9c722f6774288a30e6cfe4110fa9d2d5" exitCode=0 Oct 06 12:13:10 crc kubenswrapper[4892]: I1006 12:13:10.953954 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kslsb" 
event={"ID":"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3","Type":"ContainerDied","Data":"b3ef069cadded0511a3ccc71b21b63ae9c722f6774288a30e6cfe4110fa9d2d5"} Oct 06 12:13:11 crc kubenswrapper[4892]: I1006 12:13:11.960348 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kslsb" event={"ID":"cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3","Type":"ContainerStarted","Data":"c9201697a92a0169b646bbf8ed6f2682eaeb40c5434666e58adf513fd935dc1f"} Oct 06 12:13:11 crc kubenswrapper[4892]: I1006 12:13:11.962718 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8fjn" event={"ID":"44a5fd23-14c3-4215-a22c-9111e7a1c591","Type":"ContainerStarted","Data":"2eda8299c38c08b7a995f21f6e5b1087a6ff68449053268c8e909e2ffae866d1"} Oct 06 12:13:11 crc kubenswrapper[4892]: I1006 12:13:11.975839 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kslsb" podStartSLOduration=2.516065329 podStartE2EDuration="4.975824536s" podCreationTimestamp="2025-10-06 12:13:07 +0000 UTC" firstStartedPulling="2025-10-06 12:13:08.924188298 +0000 UTC m=+275.473894063" lastFinishedPulling="2025-10-06 12:13:11.383947485 +0000 UTC m=+277.933653270" observedRunningTime="2025-10-06 12:13:11.973686452 +0000 UTC m=+278.523392217" watchObservedRunningTime="2025-10-06 12:13:11.975824536 +0000 UTC m=+278.525530301" Oct 06 12:13:11 crc kubenswrapper[4892]: I1006 12:13:11.998840 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q8fjn" podStartSLOduration=2.449365527 podStartE2EDuration="4.998816859s" podCreationTimestamp="2025-10-06 12:13:07 +0000 UTC" firstStartedPulling="2025-10-06 12:13:08.918740646 +0000 UTC m=+275.468446411" lastFinishedPulling="2025-10-06 12:13:11.468191978 +0000 UTC m=+278.017897743" observedRunningTime="2025-10-06 12:13:11.994701087 +0000 UTC m=+278.544406842" watchObservedRunningTime="2025-10-06 12:13:11.998816859 +0000 UTC m=+278.548522624" Oct 06 12:13:15 crc kubenswrapper[4892]: I1006 12:13:15.738375 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:15 crc kubenswrapper[4892]: I1006 12:13:15.738625 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:15 crc kubenswrapper[4892]: I1006 12:13:15.807528 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:15 crc kubenswrapper[4892]: I1006 12:13:15.964137 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:15 crc kubenswrapper[4892]: I1006 12:13:15.964186 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:16 crc kubenswrapper[4892]: I1006 12:13:16.024578 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:16 crc kubenswrapper[4892]: I1006 12:13:16.037713 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j2hm7" Oct 06 12:13:16 crc kubenswrapper[4892]: I1006 12:13:16.080953 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 12:13:18 crc kubenswrapper[4892]: I1006 12:13:18.128499 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:18 crc kubenswrapper[4892]: I1006 12:13:18.128886 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:18 crc kubenswrapper[4892]: I1006 12:13:18.179219 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:13:18 crc kubenswrapper[4892]: I1006 12:13:18.388527 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:18 crc kubenswrapper[4892]: I1006 12:13:18.388604 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:18 crc kubenswrapper[4892]: I1006 12:13:18.460771 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:19 crc kubenswrapper[4892]: I1006 12:13:19.050775 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kslsb" Oct 06 12:13:19 crc kubenswrapper[4892]: I1006 12:13:19.068347 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q8fjn" Oct 06 12:14:52 crc kubenswrapper[4892]: I1006 12:14:52.985215 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:14:52 crc kubenswrapper[4892]: I1006 12:14:52.985920 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.152507 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62"] Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.155025 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.158604 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.160238 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.187637 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62"] Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.253519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slgdb\" (UniqueName: \"kubernetes.io/projected/4e968722-61cc-49e2-a817-981c8a48b4de-kube-api-access-slgdb\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.254694 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e968722-61cc-49e2-a817-981c8a48b4de-config-volume\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.254876 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e968722-61cc-49e2-a817-981c8a48b4de-secret-volume\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.356596 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e968722-61cc-49e2-a817-981c8a48b4de-secret-volume\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.356650 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slgdb\" (UniqueName: \"kubernetes.io/projected/4e968722-61cc-49e2-a817-981c8a48b4de-kube-api-access-slgdb\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.356722 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e968722-61cc-49e2-a817-981c8a48b4de-config-volume\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.357504 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e968722-61cc-49e2-a817-981c8a48b4de-config-volume\") pod 
\"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.365411 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e968722-61cc-49e2-a817-981c8a48b4de-secret-volume\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.388147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slgdb\" (UniqueName: \"kubernetes.io/projected/4e968722-61cc-49e2-a817-981c8a48b4de-kube-api-access-slgdb\") pod \"collect-profiles-29329215-q2p62\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.484793 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:00 crc kubenswrapper[4892]: I1006 12:15:00.751970 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62"] Oct 06 12:15:01 crc kubenswrapper[4892]: I1006 12:15:01.708539 4892 generic.go:334] "Generic (PLEG): container finished" podID="4e968722-61cc-49e2-a817-981c8a48b4de" containerID="07b2d548f9b68e071ce29a00de8cafc935bcb05a12b3c95624fd3ffa8eb03f90" exitCode=0 Oct 06 12:15:01 crc kubenswrapper[4892]: I1006 12:15:01.708633 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" event={"ID":"4e968722-61cc-49e2-a817-981c8a48b4de","Type":"ContainerDied","Data":"07b2d548f9b68e071ce29a00de8cafc935bcb05a12b3c95624fd3ffa8eb03f90"} Oct 06 12:15:01 crc kubenswrapper[4892]: I1006 12:15:01.708972 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" event={"ID":"4e968722-61cc-49e2-a817-981c8a48b4de","Type":"ContainerStarted","Data":"c99508735936285cd4322de3a3550441980747315c2bbe38b9e086f7ec726aeb"} Oct 06 12:15:02 crc kubenswrapper[4892]: I1006 12:15:02.997166 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.099209 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slgdb\" (UniqueName: \"kubernetes.io/projected/4e968722-61cc-49e2-a817-981c8a48b4de-kube-api-access-slgdb\") pod \"4e968722-61cc-49e2-a817-981c8a48b4de\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.099378 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e968722-61cc-49e2-a817-981c8a48b4de-config-volume\") pod \"4e968722-61cc-49e2-a817-981c8a48b4de\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.099410 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e968722-61cc-49e2-a817-981c8a48b4de-secret-volume\") pod \"4e968722-61cc-49e2-a817-981c8a48b4de\" (UID: \"4e968722-61cc-49e2-a817-981c8a48b4de\") " Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.100141 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e968722-61cc-49e2-a817-981c8a48b4de-config-volume" (OuterVolumeSpecName: "config-volume") pod "4e968722-61cc-49e2-a817-981c8a48b4de" (UID: "4e968722-61cc-49e2-a817-981c8a48b4de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.105026 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e968722-61cc-49e2-a817-981c8a48b4de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4e968722-61cc-49e2-a817-981c8a48b4de" (UID: "4e968722-61cc-49e2-a817-981c8a48b4de"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.105603 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e968722-61cc-49e2-a817-981c8a48b4de-kube-api-access-slgdb" (OuterVolumeSpecName: "kube-api-access-slgdb") pod "4e968722-61cc-49e2-a817-981c8a48b4de" (UID: "4e968722-61cc-49e2-a817-981c8a48b4de"). InnerVolumeSpecName "kube-api-access-slgdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.201269 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e968722-61cc-49e2-a817-981c8a48b4de-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.201318 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4e968722-61cc-49e2-a817-981c8a48b4de-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.201349 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slgdb\" (UniqueName: \"kubernetes.io/projected/4e968722-61cc-49e2-a817-981c8a48b4de-kube-api-access-slgdb\") on node \"crc\" DevicePath \"\"" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.727073 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" event={"ID":"4e968722-61cc-49e2-a817-981c8a48b4de","Type":"ContainerDied","Data":"c99508735936285cd4322de3a3550441980747315c2bbe38b9e086f7ec726aeb"} Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.727129 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c99508735936285cd4322de3a3550441980747315c2bbe38b9e086f7ec726aeb" Oct 06 12:15:03 crc kubenswrapper[4892]: I1006 12:15:03.727142 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62" Oct 06 12:15:22 crc kubenswrapper[4892]: I1006 12:15:22.985061 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:15:22 crc kubenswrapper[4892]: I1006 12:15:22.985829 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:15:52 crc kubenswrapper[4892]: I1006 12:15:52.984787 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:15:52 crc kubenswrapper[4892]: I1006 12:15:52.986532 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:15:52 crc kubenswrapper[4892]: I1006 12:15:52.986625 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:15:53 crc kubenswrapper[4892]: I1006 12:15:53.060423 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fe252163f7a2babdff1e10cc57f09f2bd93ebf81d17649e7aa36213a49197603"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:15:53 crc kubenswrapper[4892]: I1006 12:15:53.060540 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://fe252163f7a2babdff1e10cc57f09f2bd93ebf81d17649e7aa36213a49197603" gracePeriod=600 Oct 06 12:15:54 crc kubenswrapper[4892]: I1006 12:15:54.069036 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="fe252163f7a2babdff1e10cc57f09f2bd93ebf81d17649e7aa36213a49197603" exitCode=0 Oct 06 12:15:54 crc kubenswrapper[4892]: I1006 12:15:54.069151 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"fe252163f7a2babdff1e10cc57f09f2bd93ebf81d17649e7aa36213a49197603"} Oct 06 12:15:54 crc kubenswrapper[4892]: I1006 12:15:54.069558 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"e79f1d5ccf3f44bb99369047594e47c87713adc95a8cf367cf3aea501eded4f0"} Oct 06 12:15:54 crc kubenswrapper[4892]: I1006 12:15:54.069608 4892 scope.go:117] "RemoveContainer" containerID="f02c6c51a524d519cd2123ed4432d411c8c83262b223953711c5cdf19c885493" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.685729 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8h66n"] Oct 06 12:16:48 crc kubenswrapper[4892]: E1006 12:16:48.686689 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e968722-61cc-49e2-a817-981c8a48b4de" containerName="collect-profiles" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.686710 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e968722-61cc-49e2-a817-981c8a48b4de" containerName="collect-profiles" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.686869 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e968722-61cc-49e2-a817-981c8a48b4de" containerName="collect-profiles" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.687435 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.706051 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8h66n"] Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-registry-tls\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888106 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888140 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sth8q\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-kube-api-access-sth8q\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888166 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888190 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888277 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-registry-certificates\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-bound-sa-token\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.888469 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-trusted-ca\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.920253 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.990239 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-registry-tls\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.990317 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sth8q\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-kube-api-access-sth8q\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.990426 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.990461 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.990500 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-registry-certificates\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.990542 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-bound-sa-token\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.990607 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-trusted-ca\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.991615 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.993280 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-trusted-ca\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:48 crc kubenswrapper[4892]: I1006 12:16:48.993495 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-registry-certificates\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.006374 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-registry-tls\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.006378 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.013878 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-bound-sa-token\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.022727 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sth8q\" (UniqueName: \"kubernetes.io/projected/bf0b80ed-c516-45d3-bfdb-d53c7a8b6936-kube-api-access-sth8q\") pod \"image-registry-66df7c8f76-8h66n\" (UID: \"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936\") " pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.035472 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.274358 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8h66n"] Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.445515 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" event={"ID":"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936","Type":"ContainerStarted","Data":"98b36387d3587e27415da4f0f451527699ef28520d612666cb459655542b4b55"} Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.445631 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" event={"ID":"bf0b80ed-c516-45d3-bfdb-d53c7a8b6936","Type":"ContainerStarted","Data":"750a40de399d40c14148efe8534a089dd54bebe96f709105ccb58137531c8709"} Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.445844 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:16:49 crc kubenswrapper[4892]: I1006 12:16:49.472106 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" podStartSLOduration=1.47207735 podStartE2EDuration="1.47207735s" podCreationTimestamp="2025-10-06 12:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:16:49.4672416 +0000 UTC m=+496.016947425" watchObservedRunningTime="2025-10-06 12:16:49.47207735 +0000 UTC m=+496.021783155" Oct 06 12:17:09 crc kubenswrapper[4892]: I1006 12:17:09.042565 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8h66n" Oct 06 12:17:09 crc kubenswrapper[4892]: I1006 12:17:09.102883 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncchp"] Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.166184 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" podUID="1d15ec4b-09ec-427a-b002-a7293f363d8a" containerName="registry" containerID="cri-o://3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542" gracePeriod=30 Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.614916 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.709775 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-trusted-ca\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.709861 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d15ec4b-09ec-427a-b002-a7293f363d8a-ca-trust-extracted\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.710017 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.710044 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-tls\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.710073 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-bound-sa-token\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.710148 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp774\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-kube-api-access-dp774\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.710174 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-certificates\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.710202 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d15ec4b-09ec-427a-b002-a7293f363d8a-installation-pull-secrets\") pod \"1d15ec4b-09ec-427a-b002-a7293f363d8a\" (UID: \"1d15ec4b-09ec-427a-b002-a7293f363d8a\") " Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.710765 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.712710 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.716445 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.716954 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.720022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d15ec4b-09ec-427a-b002-a7293f363d8a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.726771 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.729133 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-kube-api-access-dp774" (OuterVolumeSpecName: "kube-api-access-dp774") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "kube-api-access-dp774". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.729493 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d15ec4b-09ec-427a-b002-a7293f363d8a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1d15ec4b-09ec-427a-b002-a7293f363d8a" (UID: "1d15ec4b-09ec-427a-b002-a7293f363d8a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.764850 4892 generic.go:334] "Generic (PLEG): container finished" podID="1d15ec4b-09ec-427a-b002-a7293f363d8a" containerID="3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542" exitCode=0 Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.764900 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.764924 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" event={"ID":"1d15ec4b-09ec-427a-b002-a7293f363d8a","Type":"ContainerDied","Data":"3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542"} Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.764976 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncchp" event={"ID":"1d15ec4b-09ec-427a-b002-a7293f363d8a","Type":"ContainerDied","Data":"dac654b1a6e948fc45c1ad69427f3e42f6678cf0b0e75dace660fcf6c6a80145"} Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.765008 4892 scope.go:117] "RemoveContainer" containerID="3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.790204 4892 scope.go:117] "RemoveContainer" containerID="3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542" Oct 06 12:17:34 crc kubenswrapper[4892]: E1006 12:17:34.792333 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542\": container with ID starting with 3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542 not found: ID does not exist" containerID="3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.792404 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542"} err="failed to get container status \"3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542\": rpc error: code = NotFound desc = could not find container \"3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542\": container with ID starting with 3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542 not found: ID does not exist" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.811965 4892 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d15ec4b-09ec-427a-b002-a7293f363d8a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.812014 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.812036 4892 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d15ec4b-09ec-427a-b002-a7293f363d8a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.812055 4892 reconciler_common.go:293] "Volume detached for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.812072 4892 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.812088 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp774\" (UniqueName: \"kubernetes.io/projected/1d15ec4b-09ec-427a-b002-a7293f363d8a-kube-api-access-dp774\") on node \"crc\" DevicePath \"\"" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.812106 4892 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d15ec4b-09ec-427a-b002-a7293f363d8a-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.813213 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncchp"] Oct 06 12:17:34 crc kubenswrapper[4892]: I1006 12:17:34.819230 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncchp"] Oct 06 12:17:36 crc kubenswrapper[4892]: I1006 12:17:36.183422 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d15ec4b-09ec-427a-b002-a7293f363d8a" path="/var/lib/kubelet/pods/1d15ec4b-09ec-427a-b002-a7293f363d8a/volumes" Oct 06 12:17:41 crc kubenswrapper[4892]: E1006 12:17:41.119323 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d15ec4b_09ec_427a_b002_a7293f363d8a.slice/crio-3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:17:51 crc kubenswrapper[4892]: E1006 12:17:51.315763 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d15ec4b_09ec_427a_b002_a7293f363d8a.slice/crio-3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:18:01 crc kubenswrapper[4892]: E1006 12:18:01.495975 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d15ec4b_09ec_427a_b002_a7293f363d8a.slice/crio-3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.825801 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hz75l"] Oct 06 12:18:09 crc kubenswrapper[4892]: E1006 12:18:09.826476 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d15ec4b-09ec-427a-b002-a7293f363d8a" containerName="registry" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.826494 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d15ec4b-09ec-427a-b002-a7293f363d8a" containerName="registry" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.826629 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1d15ec4b-09ec-427a-b002-a7293f363d8a" containerName="registry" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.827100 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.836482 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.836552 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.836791 4892 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-7bdjj" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.840291 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-86dnp"] Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.840920 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-86dnp" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.843479 4892 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-g5897" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.849076 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hz75l"] Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.856545 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-h8rv5"] Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.857598 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.865724 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-h8rv5"] Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.868658 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-86dnp"] Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.870037 4892 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tv4j2" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.939720 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvjlj\" (UniqueName: \"kubernetes.io/projected/8f1a8124-54fd-486e-90d9-dbe21bed30d8-kube-api-access-tvjlj\") pod \"cert-manager-webhook-5655c58dd6-h8rv5\" (UID: \"8f1a8124-54fd-486e-90d9-dbe21bed30d8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.939767 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jg8\" (UniqueName: \"kubernetes.io/projected/248acf10-be69-4f77-8101-d6e3f8a454d6-kube-api-access-l9jg8\") pod \"cert-manager-cainjector-7f985d654d-hz75l\" (UID: \"248acf10-be69-4f77-8101-d6e3f8a454d6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" Oct 06 12:18:09 crc kubenswrapper[4892]: I1006 12:18:09.939816 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msqnz\" (UniqueName: \"kubernetes.io/projected/f92d7a66-86e9-4f49-9797-d0714a72e329-kube-api-access-msqnz\") pod \"cert-manager-5b446d88c5-86dnp\" (UID: \"f92d7a66-86e9-4f49-9797-d0714a72e329\") " pod="cert-manager/cert-manager-5b446d88c5-86dnp" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.041791 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvjlj\" (UniqueName: \"kubernetes.io/projected/8f1a8124-54fd-486e-90d9-dbe21bed30d8-kube-api-access-tvjlj\") pod \"cert-manager-webhook-5655c58dd6-h8rv5\" (UID: \"8f1a8124-54fd-486e-90d9-dbe21bed30d8\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.041871 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jg8\" (UniqueName: \"kubernetes.io/projected/248acf10-be69-4f77-8101-d6e3f8a454d6-kube-api-access-l9jg8\") pod \"cert-manager-cainjector-7f985d654d-hz75l\" (UID: \"248acf10-be69-4f77-8101-d6e3f8a454d6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.041965 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msqnz\" (UniqueName: \"kubernetes.io/projected/f92d7a66-86e9-4f49-9797-d0714a72e329-kube-api-access-msqnz\") pod \"cert-manager-5b446d88c5-86dnp\" (UID: \"f92d7a66-86e9-4f49-9797-d0714a72e329\") " pod="cert-manager/cert-manager-5b446d88c5-86dnp" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.065169 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvjlj\" (UniqueName: \"kubernetes.io/projected/8f1a8124-54fd-486e-90d9-dbe21bed30d8-kube-api-access-tvjlj\") pod \"cert-manager-webhook-5655c58dd6-h8rv5\" (UID: \"8f1a8124-54fd-486e-90d9-dbe21bed30d8\") " 
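
Each kube-api-access-* token volume above moves through the same reconciler pipeline: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded (the succeeded entries follow just below). A short sketch using the tvjlj timestamps copied from these entries to show the spacing between stages:

package main

import (
	"fmt"
	"time"
)

// Reconciler stages for kube-api-access-tvjlj, timestamps taken from the
// surrounding log entries.
var stages = []struct{ name, at string }{
	{"VerifyControllerAttachedVolume started", "12:18:09.939720"},
	{"MountVolume started", "12:18:10.041791"},
	{"MountVolume.SetUp succeeded", "12:18:10.065169"},
}

func main() {
	const layout = "15:04:05.000000"
	first, err := time.Parse(layout, stages[0].at)
	if err != nil {
		panic(err)
	}
	for _, s := range stages {
		t, err := time.Parse(layout, s.at)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%-40s +%v\n", s.name, t.Sub(first))
	}
}

The whole pipeline completes in roughly 125 ms here, with most of that spent between the attach check and the start of the mount.
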
pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.073053 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jg8\" (UniqueName: \"kubernetes.io/projected/248acf10-be69-4f77-8101-d6e3f8a454d6-kube-api-access-l9jg8\") pod \"cert-manager-cainjector-7f985d654d-hz75l\" (UID: \"248acf10-be69-4f77-8101-d6e3f8a454d6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.073505 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msqnz\" (UniqueName: \"kubernetes.io/projected/f92d7a66-86e9-4f49-9797-d0714a72e329-kube-api-access-msqnz\") pod \"cert-manager-5b446d88c5-86dnp\" (UID: \"f92d7a66-86e9-4f49-9797-d0714a72e329\") " pod="cert-manager/cert-manager-5b446d88c5-86dnp" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.144722 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.161234 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-86dnp" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.173819 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.488066 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-h8rv5"] Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.507787 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.631943 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-86dnp"] Oct 06 12:18:10 crc kubenswrapper[4892]: W1006 12:18:10.635420 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf92d7a66_86e9_4f49_9797_d0714a72e329.slice/crio-30baac785e9c2326f0472731a015c275eb29ee2ac60f19478631ddeea6d8f130 WatchSource:0}: Error finding container 30baac785e9c2326f0472731a015c275eb29ee2ac60f19478631ddeea6d8f130: Status 404 returned error can't find the container with id 30baac785e9c2326f0472731a015c275eb29ee2ac60f19478631ddeea6d8f130 Oct 06 12:18:10 crc kubenswrapper[4892]: I1006 12:18:10.635653 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-hz75l"] Oct 06 12:18:10 crc kubenswrapper[4892]: W1006 12:18:10.637182 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod248acf10_be69_4f77_8101_d6e3f8a454d6.slice/crio-2548d1ed8adfb8816bb9bb072c0d8e64e512a361b73ee185264754623ec6de88 WatchSource:0}: Error finding container 2548d1ed8adfb8816bb9bb072c0d8e64e512a361b73ee185264754623ec6de88: Status 404 returned error can't find the container with id 2548d1ed8adfb8816bb9bb072c0d8e64e512a361b73ee185264754623ec6de88 Oct 06 12:18:11 crc kubenswrapper[4892]: I1006 12:18:11.001992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" 
event={"ID":"248acf10-be69-4f77-8101-d6e3f8a454d6","Type":"ContainerStarted","Data":"2548d1ed8adfb8816bb9bb072c0d8e64e512a361b73ee185264754623ec6de88"} Oct 06 12:18:11 crc kubenswrapper[4892]: I1006 12:18:11.003161 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" event={"ID":"8f1a8124-54fd-486e-90d9-dbe21bed30d8","Type":"ContainerStarted","Data":"6678db329b87798f81ba9079943b7e01e2ac8229dcfe4490239d29007bb94846"} Oct 06 12:18:11 crc kubenswrapper[4892]: I1006 12:18:11.004245 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-86dnp" event={"ID":"f92d7a66-86e9-4f49-9797-d0714a72e329","Type":"ContainerStarted","Data":"30baac785e9c2326f0472731a015c275eb29ee2ac60f19478631ddeea6d8f130"} Oct 06 12:18:11 crc kubenswrapper[4892]: E1006 12:18:11.668284 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d15ec4b_09ec_427a_b002_a7293f363d8a.slice/crio-3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:18:13 crc kubenswrapper[4892]: I1006 12:18:13.020591 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" event={"ID":"248acf10-be69-4f77-8101-d6e3f8a454d6","Type":"ContainerStarted","Data":"135cde91fac4a0f9d210fdfb66f779e7cfc509f40fda4e3c4372dd51233f6bbc"} Oct 06 12:18:13 crc kubenswrapper[4892]: I1006 12:18:13.041196 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-hz75l" podStartSLOduration=1.91999169 podStartE2EDuration="4.04117625s" podCreationTimestamp="2025-10-06 12:18:09 +0000 UTC" firstStartedPulling="2025-10-06 12:18:10.641332048 +0000 UTC m=+577.191037813" lastFinishedPulling="2025-10-06 12:18:12.762516598 +0000 UTC m=+579.312222373" observedRunningTime="2025-10-06 12:18:13.039179473 +0000 UTC m=+579.588885238" watchObservedRunningTime="2025-10-06 12:18:13.04117625 +0000 UTC m=+579.590882015" Oct 06 12:18:15 crc kubenswrapper[4892]: I1006 12:18:15.040240 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" event={"ID":"8f1a8124-54fd-486e-90d9-dbe21bed30d8","Type":"ContainerStarted","Data":"f73336b79eb04ac6ad044f0bf961a4b76362b59476c216225b5087a25cd3c365"} Oct 06 12:18:15 crc kubenswrapper[4892]: I1006 12:18:15.040598 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" Oct 06 12:18:15 crc kubenswrapper[4892]: I1006 12:18:15.042689 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-86dnp" event={"ID":"f92d7a66-86e9-4f49-9797-d0714a72e329","Type":"ContainerStarted","Data":"d2fea443df002243a359cbaed79cc7cb734fd94d9accedf3c833017246b54406"} Oct 06 12:18:15 crc kubenswrapper[4892]: I1006 12:18:15.069358 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" podStartSLOduration=2.607849807 podStartE2EDuration="6.069287948s" podCreationTimestamp="2025-10-06 12:18:09 +0000 UTC" firstStartedPulling="2025-10-06 12:18:10.507448332 +0000 UTC m=+577.057154097" lastFinishedPulling="2025-10-06 12:18:13.968886443 +0000 UTC m=+580.518592238" observedRunningTime="2025-10-06 12:18:15.062436053 +0000 UTC 
Oct 06 12:18:15 crc kubenswrapper[4892]: I1006 12:18:15.069358 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5" podStartSLOduration=2.607849807 podStartE2EDuration="6.069287948s" podCreationTimestamp="2025-10-06 12:18:09 +0000 UTC" firstStartedPulling="2025-10-06 12:18:10.507448332 +0000 UTC m=+577.057154097" lastFinishedPulling="2025-10-06 12:18:13.968886443 +0000 UTC m=+580.518592238" observedRunningTime="2025-10-06 12:18:15.062436053 +0000 UTC m=+581.612141818" watchObservedRunningTime="2025-10-06 12:18:15.069287948 +0000 UTC m=+581.618993753"
Oct 06 12:18:15 crc kubenswrapper[4892]: I1006 12:18:15.091493 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-86dnp" podStartSLOduration=2.701365272 podStartE2EDuration="6.09145954s" podCreationTimestamp="2025-10-06 12:18:09 +0000 UTC" firstStartedPulling="2025-10-06 12:18:10.637223081 +0000 UTC m=+577.186928846" lastFinishedPulling="2025-10-06 12:18:14.027317309 +0000 UTC m=+580.577023114" observedRunningTime="2025-10-06 12:18:15.088038973 +0000 UTC m=+581.637744778" watchObservedRunningTime="2025-10-06 12:18:15.09145954 +0000 UTC m=+581.641165345"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.180648 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-h8rv5"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.463607 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cxmhh"]
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.464204 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-controller" containerID="cri-o://ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.464307 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.464400 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="northd" containerID="cri-o://a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.464424 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-node" containerID="cri-o://e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.464459 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-acl-logging" containerID="cri-o://a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.464386 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="nbdb" containerID="cri-o://8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.464753 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="sbdb" containerID="cri-o://6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.514919 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller" containerID="cri-o://aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a" gracePeriod=30
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.808607 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/3.log"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.812127 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovn-acl-logging/0.log"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.813037 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovn-controller/0.log"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.813677 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.896380 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dqx5k"]
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.896726 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-acl-logging"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.896769 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-acl-logging"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.896788 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kubecfg-setup"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.896806 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kubecfg-setup"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.896831 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="nbdb"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.896849 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="nbdb"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.896873 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.896889 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.896920 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="northd"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.896938 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="northd"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.896966 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.896982 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.897002 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="sbdb"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897017 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="sbdb"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.897036 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897053 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.897075 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897092 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.897117 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-node"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897134 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-node"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.897161 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897179 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.897203 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-ovn-metrics"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897218 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-ovn-metrics"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897469 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897492 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="sbdb"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897513 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897536 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-node"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897561 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-acl-logging"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897585 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="nbdb"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897606 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovn-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897623 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="kube-rbac-proxy-ovn-metrics"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897654 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="northd"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897672 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: E1006 12:18:20.897901 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.897924 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.898207 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.898233 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerName="ovnkube-controller"
Oct 06 12:18:20 crc kubenswrapper[4892]: I1006 12:18:20.902057 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.000418 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-kubelet\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") "
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.000497 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-etc-openvswitch\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") "
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.000588 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-netd\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.000644 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-var-lib-openvswitch\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.000671 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.000673 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.000747 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001516 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-netns\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001553 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001705 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001635 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-ovn\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001865 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-node-log\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001927 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-systemd-units\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001936 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-node-log" (OuterVolumeSpecName: "node-log") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.001998 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovn-node-metrics-cert\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002046 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-log-socket\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002098 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-env-overrides\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002144 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-log-socket" (OuterVolumeSpecName: "log-socket") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002173 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-ovn-kubernetes\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002309 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002399 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002470 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002425 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-slash\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002541 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-config\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002580 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-bin\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002584 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-slash" (OuterVolumeSpecName: "host-slash") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002609 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-systemd\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002640 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-openvswitch\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002675 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swtk8\" (UniqueName: \"kubernetes.io/projected/e115ba33-9ba0-42d6-82a0-09ef8c996788-kube-api-access-swtk8\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002710 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-script-lib\") pod \"e115ba33-9ba0-42d6-82a0-09ef8c996788\" (UID: \"e115ba33-9ba0-42d6-82a0-09ef8c996788\") " Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002640 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002946 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-systemd\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.002996 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-cni-bin\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003043 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-slash\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003094 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovnkube-config\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003250 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-cni-netd\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003283 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003382 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px5xm\" (UniqueName: \"kubernetes.io/projected/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-kube-api-access-px5xm\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003427 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003456 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovnkube-script-lib\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003517 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003612 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003646 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-log-socket\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003699 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovn-node-metrics-cert\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003733 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-systemd-units\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003778 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003834 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-etc-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003883 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-kubelet\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.003913 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-run-ovn-kubernetes\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004039 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-env-overrides\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004096 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-run-netns\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004143 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-var-lib-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004195 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004273 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-ovn\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004386 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-node-log\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004491 4892 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004519 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004537 4892 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004555 4892 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004574 4892 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004590 4892 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004608 4892 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004623 4892 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004640 4892 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-node-log\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004655 4892 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004672 4892 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-log-socket\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004687 4892 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004704 4892 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004722 4892 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004739 4892 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-slash\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004756 4892 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.004773 4892 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.008665 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.008721 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e115ba33-9ba0-42d6-82a0-09ef8c996788-kube-api-access-swtk8" (OuterVolumeSpecName: "kube-api-access-swtk8") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "kube-api-access-swtk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.018426 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e115ba33-9ba0-42d6-82a0-09ef8c996788" (UID: "e115ba33-9ba0-42d6-82a0-09ef8c996788"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.084922 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovnkube-controller/3.log" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.088481 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovn-acl-logging/0.log" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089097 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-cxmhh_e115ba33-9ba0-42d6-82a0-09ef8c996788/ovn-controller/0.log" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089666 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a" exitCode=0 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089710 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56" exitCode=0 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089725 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe" exitCode=0 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089738 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360" exitCode=0 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089753 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" exitCode=0 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089768 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010" exitCode=0 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089763 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089824 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089836 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089861 4892 scope.go:117] "RemoveContainer" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089782 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc" exitCode=143 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089892 4892 generic.go:334] "Generic (PLEG): container finished" podID="e115ba33-9ba0-42d6-82a0-09ef8c996788" containerID="ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0" exitCode=143 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.089845 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090065 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090121 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090142 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090162 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090179 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090191 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090205 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090216 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090227 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} Oct 06 12:18:21 crc 
kubenswrapper[4892]: I1006 12:18:21.090238 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090249 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090259 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090273 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090289 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090301 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090313 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090353 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090364 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090375 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090385 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090396 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090407 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090417 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} Oct 06 12:18:21 crc 
kubenswrapper[4892]: I1006 12:18:21.090432 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090449 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090463 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090476 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090487 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090498 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090508 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090518 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090531 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090542 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090552 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090568 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-cxmhh" event={"ID":"e115ba33-9ba0-42d6-82a0-09ef8c996788","Type":"ContainerDied","Data":"78d656be122ec40f7a14867c983b72f6b24e17f230349e4654869a98b3b017a7"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090583 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090596 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090608 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090619 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090631 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090642 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090653 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090663 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090673 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.090684 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.092242 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/2.log" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.093891 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/1.log" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.093949 4892 generic.go:334] "Generic (PLEG): container finished" podID="df1cea25-4170-457d-b579-2678161d7d53" containerID="1f18db21adfe184eeb4fb4e20b10f5e36bb64cb873e1ad45648cc412d4cba0eb" exitCode=2 Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.093998 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerDied","Data":"1f18db21adfe184eeb4fb4e20b10f5e36bb64cb873e1ad45648cc412d4cba0eb"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.094034 4892 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe"} Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.095057 4892 scope.go:117] "RemoveContainer" containerID="1f18db21adfe184eeb4fb4e20b10f5e36bb64cb873e1ad45648cc412d4cba0eb" Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.106917 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px5xm\" (UniqueName: \"kubernetes.io/projected/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-kube-api-access-px5xm\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107037 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovnkube-script-lib\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107161 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-log-socket\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107211 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovn-node-metrics-cert\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107242 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-systemd-units\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107348 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-etc-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107381 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-kubelet\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107410 4892 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-run-ovn-kubernetes\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107459 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-env-overrides\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107490 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-run-netns\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107518 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-var-lib-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107547 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107584 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-ovn\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107611 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-node-log\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107676 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-systemd\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107708 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-cni-bin\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107742 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-slash\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107771 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovnkube-config\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.107801 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-cni-netd\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108526 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-env-overrides\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-run-netns\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108159 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-var-lib-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108208 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108246 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovnkube-script-lib\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108254 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-ovn\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108288 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-cni-bin\") 
pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108346 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-systemd\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108378 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-kubelet\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108407 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-etc-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108435 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-slash\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108038 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-node-log\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108938 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-systemd-units\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108962 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-log-socket\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.109004 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-cni-netd\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.108967 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-host-run-ovn-kubernetes\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.109018 
4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-run-openvswitch\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.109269 4892 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e115ba33-9ba0-42d6-82a0-09ef8c996788-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.109306 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swtk8\" (UniqueName: \"kubernetes.io/projected/e115ba33-9ba0-42d6-82a0-09ef8c996788-kube-api-access-swtk8\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.109352 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e115ba33-9ba0-42d6-82a0-09ef8c996788-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.109697 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovnkube-config\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.112675 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-ovn-node-metrics-cert\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.121743 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.131937 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px5xm\" (UniqueName: \"kubernetes.io/projected/d1e0cd37-7808-4a97-a82c-d35e17bd0fd5-kube-api-access-px5xm\") pod \"ovnkube-node-dqx5k\" (UID: \"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5\") " pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.147449 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cxmhh"] Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.152426 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-cxmhh"] Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.170527 4892 scope.go:117] "RemoveContainer" containerID="6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.185992 4892 scope.go:117] "RemoveContainer" containerID="8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.203460 4892 scope.go:117] "RemoveContainer" containerID="a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.221018 4892 scope.go:117] "RemoveContainer" containerID="3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" Oct 06 12:18:21 crc 
kubenswrapper[4892]: I1006 12:18:21.222682 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.252150 4892 scope.go:117] "RemoveContainer" containerID="e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.274414 4892 scope.go:117] "RemoveContainer" containerID="a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.300249 4892 scope.go:117] "RemoveContainer" containerID="ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.325158 4892 scope.go:117] "RemoveContainer" containerID="153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.341767 4892 scope.go:117] "RemoveContainer" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"
Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.342128 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": container with ID starting with aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a not found: ID does not exist" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.342152 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} err="failed to get container status \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": rpc error: code = NotFound desc = could not find container \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": container with ID starting with aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a not found: ID does not exist"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.342176 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"
Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.342693 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": container with ID starting with f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d not found: ID does not exist" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.342711 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} err="failed to get container status \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": rpc error: code = NotFound desc = could not find container \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": container with ID starting with f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d not found: ID does not exist"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.342723 4892 scope.go:117] "RemoveContainer" containerID="6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"
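Every RemoveContainer above ends in a gRPC NotFound from CRI-O because the containers were already gone by the time the kubelet retried the delete; the kubelet just logs the error and moves on, since a missing container is exactly the desired end state. The usual client-side pattern is to treat NotFound as success when deleting; a minimal sketch against a gRPC-style API, where deleteContainer is a hypothetical stand-in for the actual RPC:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeIfPresent treats NotFound as success: if the runtime already
// deleted the container (as CRI-O did here), the goal state is reached.
func removeIfPresent(deleteContainer func(id string) error, id string) error {
	err := deleteContainer(id)
	if status.Code(err) == codes.NotFound {
		return nil // already gone; deletion is idempotent
	}
	return err
}

func main() {
	// Simulate a runtime that has already removed the container.
	gone := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(removeIfPresent(gone, "aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a")) // <nil>
}
```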
Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.343114 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": container with ID starting with 6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56 not found: ID does not exist" containerID="6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.343162 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} err="failed to get container status \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": rpc error: code = NotFound desc = could not find container \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": container with ID starting with 6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56 not found: ID does not exist"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.343201 4892 scope.go:117] "RemoveContainer" containerID="8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"
Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.343590 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": container with ID starting with 8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe not found: ID does not exist" containerID="8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.343646 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} err="failed to get container status \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": rpc error: code = NotFound desc = could not find container \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": container with ID starting with 8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe not found: ID does not exist"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.343673 4892 scope.go:117] "RemoveContainer" containerID="a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"
Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.344066 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": container with ID starting with a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360 not found: ID does not exist" containerID="a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"
Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.344096 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} err="failed to get container status \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": rpc error: code = NotFound desc = could not find container \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": container with ID starting with a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360 not found: ID does not exist"
Oct 06 12:18:21 crc
kubenswrapper[4892]: I1006 12:18:21.344121 4892 scope.go:117] "RemoveContainer" containerID="3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.344475 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": container with ID starting with 3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa not found: ID does not exist" containerID="3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.344495 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} err="failed to get container status \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": rpc error: code = NotFound desc = could not find container \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": container with ID starting with 3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.344509 4892 scope.go:117] "RemoveContainer" containerID="e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010" Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.344921 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": container with ID starting with e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010 not found: ID does not exist" containerID="e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.344981 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} err="failed to get container status \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": rpc error: code = NotFound desc = could not find container \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": container with ID starting with e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.345018 4892 scope.go:117] "RemoveContainer" containerID="a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc" Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.345465 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": container with ID starting with a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc not found: ID does not exist" containerID="a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.345517 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} err="failed to get container status \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": rpc error: code = NotFound desc = could not find container 
\"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": container with ID starting with a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.345553 4892 scope.go:117] "RemoveContainer" containerID="ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0" Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.345909 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": container with ID starting with ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0 not found: ID does not exist" containerID="ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.345951 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} err="failed to get container status \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": rpc error: code = NotFound desc = could not find container \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": container with ID starting with ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.345977 4892 scope.go:117] "RemoveContainer" containerID="153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b" Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.346220 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": container with ID starting with 153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b not found: ID does not exist" containerID="153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.346259 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} err="failed to get container status \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": rpc error: code = NotFound desc = could not find container \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": container with ID starting with 153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.346284 4892 scope.go:117] "RemoveContainer" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.346588 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} err="failed to get container status \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": rpc error: code = NotFound desc = could not find container \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": container with ID starting with aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.346627 4892 scope.go:117] "RemoveContainer" 
containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.346886 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} err="failed to get container status \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": rpc error: code = NotFound desc = could not find container \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": container with ID starting with f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.347124 4892 scope.go:117] "RemoveContainer" containerID="6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.347667 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} err="failed to get container status \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": rpc error: code = NotFound desc = could not find container \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": container with ID starting with 6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.347712 4892 scope.go:117] "RemoveContainer" containerID="8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.347955 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} err="failed to get container status \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": rpc error: code = NotFound desc = could not find container \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": container with ID starting with 8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.347988 4892 scope.go:117] "RemoveContainer" containerID="a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.348276 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} err="failed to get container status \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": rpc error: code = NotFound desc = could not find container \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": container with ID starting with a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.348309 4892 scope.go:117] "RemoveContainer" containerID="3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.348586 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} err="failed to get container status \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": rpc error: code = NotFound desc = could not find 
container \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": container with ID starting with 3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.348622 4892 scope.go:117] "RemoveContainer" containerID="e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.348892 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} err="failed to get container status \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": rpc error: code = NotFound desc = could not find container \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": container with ID starting with e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.348926 4892 scope.go:117] "RemoveContainer" containerID="a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.349164 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} err="failed to get container status \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": rpc error: code = NotFound desc = could not find container \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": container with ID starting with a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.349193 4892 scope.go:117] "RemoveContainer" containerID="ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.349466 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} err="failed to get container status \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": rpc error: code = NotFound desc = could not find container \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": container with ID starting with ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.349506 4892 scope.go:117] "RemoveContainer" containerID="153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.349788 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} err="failed to get container status \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": rpc error: code = NotFound desc = could not find container \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": container with ID starting with 153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.349809 4892 scope.go:117] "RemoveContainer" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.350105 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} err="failed to get container status \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": rpc error: code = NotFound desc = could not find container \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": container with ID starting with aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.350178 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.350480 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} err="failed to get container status \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": rpc error: code = NotFound desc = could not find container \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": container with ID starting with f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.350500 4892 scope.go:117] "RemoveContainer" containerID="6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.350759 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} err="failed to get container status \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": rpc error: code = NotFound desc = could not find container \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": container with ID starting with 6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.350793 4892 scope.go:117] "RemoveContainer" containerID="8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.351043 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} err="failed to get container status \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": rpc error: code = NotFound desc = could not find container \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": container with ID starting with 8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.351077 4892 scope.go:117] "RemoveContainer" containerID="a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.351460 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} err="failed to get container status \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": rpc error: code = NotFound desc = could not find container \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": container with ID starting with 
a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.351484 4892 scope.go:117] "RemoveContainer" containerID="3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.351914 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} err="failed to get container status \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": rpc error: code = NotFound desc = could not find container \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": container with ID starting with 3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.351961 4892 scope.go:117] "RemoveContainer" containerID="e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.352619 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} err="failed to get container status \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": rpc error: code = NotFound desc = could not find container \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": container with ID starting with e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.352663 4892 scope.go:117] "RemoveContainer" containerID="a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.352951 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} err="failed to get container status \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": rpc error: code = NotFound desc = could not find container \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": container with ID starting with a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.352975 4892 scope.go:117] "RemoveContainer" containerID="ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.353237 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} err="failed to get container status \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": rpc error: code = NotFound desc = could not find container \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": container with ID starting with ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.353272 4892 scope.go:117] "RemoveContainer" containerID="153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.353644 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} err="failed to get container status \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": rpc error: code = NotFound desc = could not find container \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": container with ID starting with 153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.353682 4892 scope.go:117] "RemoveContainer" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.353971 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} err="failed to get container status \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": rpc error: code = NotFound desc = could not find container \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": container with ID starting with aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.354008 4892 scope.go:117] "RemoveContainer" containerID="f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.354273 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d"} err="failed to get container status \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": rpc error: code = NotFound desc = could not find container \"f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d\": container with ID starting with f4f257076c02816f8d37c9b17056b29f17144e96d22a94531f0eeba1445e5e7d not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.354306 4892 scope.go:117] "RemoveContainer" containerID="6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.354630 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56"} err="failed to get container status \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": rpc error: code = NotFound desc = could not find container \"6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56\": container with ID starting with 6f24250a64dfa78bde452852f364712de6b609b4413049a6adb48355efe3be56 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.354667 4892 scope.go:117] "RemoveContainer" containerID="8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.355146 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe"} err="failed to get container status \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": rpc error: code = NotFound desc = could not find container \"8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe\": container with ID starting with 8f91591b9f4a3e02446430180516d5add314fd298856dddbcbef12d4dcbdcdbe not found: ID does not exist" Oct 
06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.355185 4892 scope.go:117] "RemoveContainer" containerID="a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.355585 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360"} err="failed to get container status \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": rpc error: code = NotFound desc = could not find container \"a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360\": container with ID starting with a3e764c48df131a599c24f150e9f7fc55f7c567799ea4a6a86f8942e57b90360 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.355634 4892 scope.go:117] "RemoveContainer" containerID="3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.355945 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa"} err="failed to get container status \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": rpc error: code = NotFound desc = could not find container \"3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa\": container with ID starting with 3f918acd574e9b0bbef32a8b1ed7df29748a3661f7b8d81806846ece707aa8aa not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.355970 4892 scope.go:117] "RemoveContainer" containerID="e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.356287 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010"} err="failed to get container status \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": rpc error: code = NotFound desc = could not find container \"e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010\": container with ID starting with e5241db3b44c688ec2ff5692f1365b72b395a398a5355a7d889acbc12cc53010 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.356381 4892 scope.go:117] "RemoveContainer" containerID="a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.356761 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc"} err="failed to get container status \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": rpc error: code = NotFound desc = could not find container \"a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc\": container with ID starting with a8b0b1b5ad1d97cb099b40f9b8faeb0011af085f133773f14dc982e83ebd9cfc not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.356799 4892 scope.go:117] "RemoveContainer" containerID="ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.357136 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0"} err="failed to get container status 
\"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": rpc error: code = NotFound desc = could not find container \"ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0\": container with ID starting with ca3635b186f7fc87984c3e32b522b569ef27323ccd967b061601b5dd18c18bd0 not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.357187 4892 scope.go:117] "RemoveContainer" containerID="153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.357630 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b"} err="failed to get container status \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": rpc error: code = NotFound desc = could not find container \"153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b\": container with ID starting with 153d9a8781dd17c6ab56f435ec85252c35aeeda5036481bb3db4030f2ed9e24b not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.357653 4892 scope.go:117] "RemoveContainer" containerID="aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a" Oct 06 12:18:21 crc kubenswrapper[4892]: I1006 12:18:21.357935 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a"} err="failed to get container status \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": rpc error: code = NotFound desc = could not find container \"aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a\": container with ID starting with aad928142a1fe6dd66a93a0e1875c27a184b7c08c59a9899ef3bd2c4b966dd9a not found: ID does not exist" Oct 06 12:18:21 crc kubenswrapper[4892]: E1006 12:18:21.779421 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d15ec4b_09ec_427a_b002_a7293f363d8a.slice/crio-3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:18:22 crc kubenswrapper[4892]: I1006 12:18:22.103101 4892 generic.go:334] "Generic (PLEG): container finished" podID="d1e0cd37-7808-4a97-a82c-d35e17bd0fd5" containerID="3833f0dcc965edab22e182c47d182a1afabc7bbf0fe9d88e201487f79732a43e" exitCode=0 Oct 06 12:18:22 crc kubenswrapper[4892]: I1006 12:18:22.103239 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerDied","Data":"3833f0dcc965edab22e182c47d182a1afabc7bbf0fe9d88e201487f79732a43e"} Oct 06 12:18:22 crc kubenswrapper[4892]: I1006 12:18:22.103623 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"0a05d10929c6e7a61f7573fc76831d59b245fd97c2efc66ab17b42ff4d6d63ed"} Oct 06 12:18:22 crc kubenswrapper[4892]: I1006 12:18:22.176659 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e115ba33-9ba0-42d6-82a0-09ef8c996788" path="/var/lib/kubelet/pods/e115ba33-9ba0-42d6-82a0-09ef8c996788/volumes" Oct 06 12:18:22 crc kubenswrapper[4892]: I1006 12:18:22.984762 4892 patch_prober.go:28] interesting 
pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:18:22 crc kubenswrapper[4892]: I1006 12:18:22.985122 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:18:23 crc kubenswrapper[4892]: I1006 12:18:23.116016 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"e04b634b685f39b48c9892729dc65aba186aa2dba6c88f3ecf907ef9a43d5f59"} Oct 06 12:18:23 crc kubenswrapper[4892]: I1006 12:18:23.116070 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"fed33d511c8939c11009e65a350fee2a17b5a54469c924d854f4b43dfc093f1f"} Oct 06 12:18:23 crc kubenswrapper[4892]: I1006 12:18:23.116085 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"8593989f0b7b351c59c24d05a415f08920272970ebb83739955fc20f02d12cea"} Oct 06 12:18:23 crc kubenswrapper[4892]: I1006 12:18:23.116097 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"e7745df626abcb8fb237c342c4f983b6328e15162956e8aec2c255a97edb23d8"} Oct 06 12:18:23 crc kubenswrapper[4892]: I1006 12:18:23.116108 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"0fbe9c2798867f2a1e8c76fe09ec67ecb45952e9eac3cd3fbe94a8dc163c0bef"} Oct 06 12:18:23 crc kubenswrapper[4892]: I1006 12:18:23.116119 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"5844c5b84adf79d8236382f0262662349d781c8d2cee89c6f754b786ea2ba01a"} Oct 06 12:18:26 crc kubenswrapper[4892]: I1006 12:18:26.143184 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"93c1937e1d2ffcb874896d6a01341c20c1f360f1f53ad29be980fa6c94abc16f"} Oct 06 12:18:28 crc kubenswrapper[4892]: I1006 12:18:28.166023 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" event={"ID":"d1e0cd37-7808-4a97-a82c-d35e17bd0fd5","Type":"ContainerStarted","Data":"6ae45c359964e3a543652b17e1f7e9c31ab45f1ff1ff107c9e08bb7c017514eb"} Oct 06 12:18:28 crc kubenswrapper[4892]: I1006 12:18:28.166374 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:28 crc kubenswrapper[4892]: I1006 12:18:28.206923 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" podStartSLOduration=8.20688524 podStartE2EDuration="8.20688524s" podCreationTimestamp="2025-10-06 12:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:18:28.193759017 +0000 UTC m=+594.743464842" watchObservedRunningTime="2025-10-06 12:18:28.20688524 +0000 UTC m=+594.756591045" Oct 06 12:18:28 crc kubenswrapper[4892]: I1006 12:18:28.209379 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:29 crc kubenswrapper[4892]: I1006 12:18:29.176046 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:29 crc kubenswrapper[4892]: I1006 12:18:29.176102 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:29 crc kubenswrapper[4892]: I1006 12:18:29.216957 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:31 crc kubenswrapper[4892]: E1006 12:18:31.914422 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d15ec4b_09ec_427a_b002_a7293f363d8a.slice/crio-3ebf646410a0852c817bb7e6e87c3c82c6538dbc696aca2999a7803c354ae542.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:18:34 crc kubenswrapper[4892]: I1006 12:18:34.356308 4892 scope.go:117] "RemoveContainer" containerID="0674128efdea7d4b2f70f3aa6375cb6d381ccfa6e4892f2f2c3037e7d31720fe" Oct 06 12:18:35 crc kubenswrapper[4892]: I1006 12:18:35.168528 4892 scope.go:117] "RemoveContainer" containerID="1f18db21adfe184eeb4fb4e20b10f5e36bb64cb873e1ad45648cc412d4cba0eb" Oct 06 12:18:35 crc kubenswrapper[4892]: E1006 12:18:35.169308 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-5zfsp_openshift-multus(df1cea25-4170-457d-b579-2678161d7d53)\"" pod="openshift-multus/multus-5zfsp" podUID="df1cea25-4170-457d-b579-2678161d7d53" Oct 06 12:18:35 crc kubenswrapper[4892]: I1006 12:18:35.214730 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/2.log" Oct 06 12:18:48 crc kubenswrapper[4892]: I1006 12:18:48.168851 4892 scope.go:117] "RemoveContainer" containerID="1f18db21adfe184eeb4fb4e20b10f5e36bb64cb873e1ad45648cc412d4cba0eb" Oct 06 12:18:48 crc kubenswrapper[4892]: I1006 12:18:48.953762 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8"] Oct 06 12:18:48 crc kubenswrapper[4892]: I1006 12:18:48.957281 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:48 crc kubenswrapper[4892]: I1006 12:18:48.965963 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 12:18:48 crc kubenswrapper[4892]: I1006 12:18:48.973180 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8"] Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.031599 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.031682 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c8r9\" (UniqueName: \"kubernetes.io/projected/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-kube-api-access-5c8r9\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.031869 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.133469 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c8r9\" (UniqueName: \"kubernetes.io/projected/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-kube-api-access-5c8r9\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.133674 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.133790 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.134584 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.134731 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.167529 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c8r9\" (UniqueName: \"kubernetes.io/projected/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-kube-api-access-5c8r9\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.291121 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.324558 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-5zfsp_df1cea25-4170-457d-b579-2678161d7d53/kube-multus/2.log" Oct 06 12:18:49 crc kubenswrapper[4892]: I1006 12:18:49.324630 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-5zfsp" event={"ID":"df1cea25-4170-457d-b579-2678161d7d53","Type":"ContainerStarted","Data":"eec3957c61d115fb3e723863661d7f73acea3cf86ce0f5e6188553f5feeac8cb"} Oct 06 12:18:49 crc kubenswrapper[4892]: E1006 12:18:49.352064 4892 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_openshift-marketplace_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24_0(08d495bee7dde1353a9e20445c7e5b4fd31709c186249a1b40c95da140d412b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 06 12:18:49 crc kubenswrapper[4892]: E1006 12:18:49.352165 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_openshift-marketplace_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24_0(08d495bee7dde1353a9e20445c7e5b4fd31709c186249a1b40c95da140d412b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: E1006 12:18:49.352206 4892 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_openshift-marketplace_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24_0(08d495bee7dde1353a9e20445c7e5b4fd31709c186249a1b40c95da140d412b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:49 crc kubenswrapper[4892]: E1006 12:18:49.352275 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_openshift-marketplace(b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_openshift-marketplace(b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_openshift-marketplace_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24_0(08d495bee7dde1353a9e20445c7e5b4fd31709c186249a1b40c95da140d412b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" Oct 06 12:18:50 crc kubenswrapper[4892]: I1006 12:18:50.331287 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:50 crc kubenswrapper[4892]: I1006 12:18:50.331972 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:50 crc kubenswrapper[4892]: I1006 12:18:50.599535 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8"] Oct 06 12:18:50 crc kubenswrapper[4892]: W1006 12:18:50.610654 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e4cf48_a27b_4fa1_89ba_efeb6dc3bf24.slice/crio-d3d3ebe6a3b0cfbce24e0354b5a479a3a59167be756be28d7da94eeab23794f3 WatchSource:0}: Error finding container d3d3ebe6a3b0cfbce24e0354b5a479a3a59167be756be28d7da94eeab23794f3: Status 404 returned error can't find the container with id d3d3ebe6a3b0cfbce24e0354b5a479a3a59167be756be28d7da94eeab23794f3 Oct 06 12:18:51 crc kubenswrapper[4892]: I1006 12:18:51.245642 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dqx5k" Oct 06 12:18:51 crc kubenswrapper[4892]: I1006 12:18:51.337849 4892 generic.go:334] "Generic (PLEG): container finished" podID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerID="ec94222db48367182539b25357199becda14146e088a79aeb4c98d1c573ddfdf" exitCode=0 Oct 06 12:18:51 crc kubenswrapper[4892]: I1006 12:18:51.337897 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" event={"ID":"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24","Type":"ContainerDied","Data":"ec94222db48367182539b25357199becda14146e088a79aeb4c98d1c573ddfdf"} Oct 06 12:18:51 crc kubenswrapper[4892]: I1006 12:18:51.337924 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" event={"ID":"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24","Type":"ContainerStarted","Data":"d3d3ebe6a3b0cfbce24e0354b5a479a3a59167be756be28d7da94eeab23794f3"} Oct 06 12:18:52 crc kubenswrapper[4892]: I1006 12:18:52.984597 4892 patch_prober.go:28] interesting 
pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:18:52 crc kubenswrapper[4892]: I1006 12:18:52.984920 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:18:53 crc kubenswrapper[4892]: I1006 12:18:53.355895 4892 generic.go:334] "Generic (PLEG): container finished" podID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerID="72317db388e5a5f577c8d8532dfc09ee29a050d439fd167621fd2ca418ad22b4" exitCode=0 Oct 06 12:18:53 crc kubenswrapper[4892]: I1006 12:18:53.356060 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" event={"ID":"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24","Type":"ContainerDied","Data":"72317db388e5a5f577c8d8532dfc09ee29a050d439fd167621fd2ca418ad22b4"} Oct 06 12:18:54 crc kubenswrapper[4892]: I1006 12:18:54.366873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" event={"ID":"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24","Type":"ContainerStarted","Data":"91049d4ec629429632227e79e4b853a50c6c13cabb1059565a07f763b92adf25"} Oct 06 12:18:55 crc kubenswrapper[4892]: I1006 12:18:55.377103 4892 generic.go:334] "Generic (PLEG): container finished" podID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerID="91049d4ec629429632227e79e4b853a50c6c13cabb1059565a07f763b92adf25" exitCode=0 Oct 06 12:18:55 crc kubenswrapper[4892]: I1006 12:18:55.377174 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" event={"ID":"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24","Type":"ContainerDied","Data":"91049d4ec629429632227e79e4b853a50c6c13cabb1059565a07f763b92adf25"} Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.703056 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.779035 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-bundle\") pod \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.779130 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c8r9\" (UniqueName: \"kubernetes.io/projected/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-kube-api-access-5c8r9\") pod \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.779181 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-util\") pod \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\" (UID: \"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24\") " Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.782141 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-bundle" (OuterVolumeSpecName: "bundle") pod "b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" (UID: "b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.788012 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-kube-api-access-5c8r9" (OuterVolumeSpecName: "kube-api-access-5c8r9") pod "b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" (UID: "b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24"). InnerVolumeSpecName "kube-api-access-5c8r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.793877 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-util" (OuterVolumeSpecName: "util") pod "b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" (UID: "b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.881061 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-util\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.881089 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:56 crc kubenswrapper[4892]: I1006 12:18:56.881098 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c8r9\" (UniqueName: \"kubernetes.io/projected/b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24-kube-api-access-5c8r9\") on node \"crc\" DevicePath \"\"" Oct 06 12:18:57 crc kubenswrapper[4892]: I1006 12:18:57.395023 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" event={"ID":"b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24","Type":"ContainerDied","Data":"d3d3ebe6a3b0cfbce24e0354b5a479a3a59167be756be28d7da94eeab23794f3"} Oct 06 12:18:57 crc kubenswrapper[4892]: I1006 12:18:57.395439 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3d3ebe6a3b0cfbce24e0354b5a479a3a59167be756be28d7da94eeab23794f3" Oct 06 12:18:57 crc kubenswrapper[4892]: I1006 12:18:57.395131 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.188505 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv"] Oct 06 12:19:07 crc kubenswrapper[4892]: E1006 12:19:07.189298 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerName="extract" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.189315 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerName="extract" Oct 06 12:19:07 crc kubenswrapper[4892]: E1006 12:19:07.189357 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerName="pull" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.189366 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerName="pull" Oct 06 12:19:07 crc kubenswrapper[4892]: E1006 12:19:07.189379 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerName="util" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.189386 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerName="util" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.189508 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24" containerName="extract" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.189955 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.192583 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-q4pwd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.192687 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.194603 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.204861 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.205519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc47j\" (UniqueName: \"kubernetes.io/projected/aae867b4-2097-459f-a413-0ead7e4478ce-kube-api-access-lc47j\") pod \"obo-prometheus-operator-7c8cf85677-8fxfv\" (UID: \"aae867b4-2097-459f-a413-0ead7e4478ce\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.307033 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc47j\" (UniqueName: \"kubernetes.io/projected/aae867b4-2097-459f-a413-0ead7e4478ce-kube-api-access-lc47j\") pod \"obo-prometheus-operator-7c8cf85677-8fxfv\" (UID: \"aae867b4-2097-459f-a413-0ead7e4478ce\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.309333 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.310095 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.313110 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.313253 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-bvclq" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.324925 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.325690 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.330009 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc47j\" (UniqueName: \"kubernetes.io/projected/aae867b4-2097-459f-a413-0ead7e4478ce-kube-api-access-lc47j\") pod \"obo-prometheus-operator-7c8cf85677-8fxfv\" (UID: \"aae867b4-2097-459f-a413-0ead7e4478ce\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.341674 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.344722 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.505216 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.514997 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a0093d1-7a06-4147-816d-4bc7a73c505d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx\" (UID: \"6a0093d1-7a06-4147-816d-4bc7a73c505d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.515085 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a0093d1-7a06-4147-816d-4bc7a73c505d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx\" (UID: \"6a0093d1-7a06-4147-816d-4bc7a73c505d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.515204 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd\" (UID: \"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.515256 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd\" (UID: \"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.529827 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-4dkvd"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.531131 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.533821 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-bd7km" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.544417 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.547620 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-4dkvd"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.619153 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd\" (UID: \"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.619486 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd\" (UID: \"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.619516 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55dq\" (UniqueName: \"kubernetes.io/projected/178f722d-bf92-4584-b0cf-9550c41b3153-kube-api-access-f55dq\") pod \"observability-operator-cc5f78dfc-4dkvd\" (UID: \"178f722d-bf92-4584-b0cf-9550c41b3153\") " pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.619547 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a0093d1-7a06-4147-816d-4bc7a73c505d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx\" (UID: \"6a0093d1-7a06-4147-816d-4bc7a73c505d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.619577 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a0093d1-7a06-4147-816d-4bc7a73c505d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx\" (UID: \"6a0093d1-7a06-4147-816d-4bc7a73c505d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.619601 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/178f722d-bf92-4584-b0cf-9550c41b3153-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-4dkvd\" (UID: \"178f722d-bf92-4584-b0cf-9550c41b3153\") " pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.629062 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd\" (UID: \"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.629113 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6a0093d1-7a06-4147-816d-4bc7a73c505d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx\" (UID: \"6a0093d1-7a06-4147-816d-4bc7a73c505d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.629906 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd\" (UID: \"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.629965 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6a0093d1-7a06-4147-816d-4bc7a73c505d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx\" (UID: \"6a0093d1-7a06-4147-816d-4bc7a73c505d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.660210 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.716261 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-gtbwr"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.716956 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.720103 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f55dq\" (UniqueName: \"kubernetes.io/projected/178f722d-bf92-4584-b0cf-9550c41b3153-kube-api-access-f55dq\") pod \"observability-operator-cc5f78dfc-4dkvd\" (UID: \"178f722d-bf92-4584-b0cf-9550c41b3153\") " pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.720169 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/aef96bb9-192a-4934-b46d-2e2cea0ac97e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-gtbwr\" (UID: \"aef96bb9-192a-4934-b46d-2e2cea0ac97e\") " pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.720212 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/178f722d-bf92-4584-b0cf-9550c41b3153-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-4dkvd\" (UID: \"178f722d-bf92-4584-b0cf-9550c41b3153\") " pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.720263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9fxx\" (UniqueName: \"kubernetes.io/projected/aef96bb9-192a-4934-b46d-2e2cea0ac97e-kube-api-access-r9fxx\") pod \"perses-operator-54bc95c9fb-gtbwr\" (UID: \"aef96bb9-192a-4934-b46d-2e2cea0ac97e\") " pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.720385 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-gtbwr"] Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.724888 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gt9qt" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.725030 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/178f722d-bf92-4584-b0cf-9550c41b3153-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-4dkvd\" (UID: \"178f722d-bf92-4584-b0cf-9550c41b3153\") " pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.751014 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f55dq\" (UniqueName: \"kubernetes.io/projected/178f722d-bf92-4584-b0cf-9550c41b3153-kube-api-access-f55dq\") pod \"observability-operator-cc5f78dfc-4dkvd\" (UID: \"178f722d-bf92-4584-b0cf-9550c41b3153\") " pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.821046 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fxx\" (UniqueName: \"kubernetes.io/projected/aef96bb9-192a-4934-b46d-2e2cea0ac97e-kube-api-access-r9fxx\") pod \"perses-operator-54bc95c9fb-gtbwr\" (UID: \"aef96bb9-192a-4934-b46d-2e2cea0ac97e\") " pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.821124 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/aef96bb9-192a-4934-b46d-2e2cea0ac97e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-gtbwr\" (UID: \"aef96bb9-192a-4934-b46d-2e2cea0ac97e\") " pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.822012 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/aef96bb9-192a-4934-b46d-2e2cea0ac97e-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-gtbwr\" (UID: \"aef96bb9-192a-4934-b46d-2e2cea0ac97e\") " pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.841692 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9fxx\" (UniqueName: \"kubernetes.io/projected/aef96bb9-192a-4934-b46d-2e2cea0ac97e-kube-api-access-r9fxx\") pod \"perses-operator-54bc95c9fb-gtbwr\" (UID: \"aef96bb9-192a-4934-b46d-2e2cea0ac97e\") " pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.887780 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.923167 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" Oct 06 12:19:07 crc kubenswrapper[4892]: I1006 12:19:07.926999 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv"] Oct 06 12:19:07 crc kubenswrapper[4892]: W1006 12:19:07.939635 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae867b4_2097_459f_a413_0ead7e4478ce.slice/crio-cac2a924b69b2eaa72dfe03924474b0780921129a925535ec3625d349816b678 WatchSource:0}: Error finding container cac2a924b69b2eaa72dfe03924474b0780921129a925535ec3625d349816b678: Status 404 returned error can't find the container with id cac2a924b69b2eaa72dfe03924474b0780921129a925535ec3625d349816b678 Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.039381 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.097659 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-4dkvd"] Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.117162 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx"] Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.396217 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd"] Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.446775 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-gtbwr"] Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.457621 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" event={"ID":"178f722d-bf92-4584-b0cf-9550c41b3153","Type":"ContainerStarted","Data":"81ff469cb012344e65e5748d3e804952fd36c505abb2df6f6a50a2d1674d2e58"} Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.460116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" event={"ID":"6a0093d1-7a06-4147-816d-4bc7a73c505d","Type":"ContainerStarted","Data":"04dca985d9c3449a08909a3ef0f376cafbf1e17bfbae237895b82c6228a73fc8"} Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.461260 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" event={"ID":"aae867b4-2097-459f-a413-0ead7e4478ce","Type":"ContainerStarted","Data":"cac2a924b69b2eaa72dfe03924474b0780921129a925535ec3625d349816b678"} Oct 06 12:19:08 crc kubenswrapper[4892]: I1006 12:19:08.462760 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" event={"ID":"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6","Type":"ContainerStarted","Data":"5c873cf029b7466e5e41460d613b22059ba788923d23de9ef85c66d00f7d7ef6"} Oct 06 12:19:09 crc kubenswrapper[4892]: I1006 12:19:09.477936 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" event={"ID":"aef96bb9-192a-4934-b46d-2e2cea0ac97e","Type":"ContainerStarted","Data":"4d35e5b205152b845f780644fa0c83fd38241362dd3d740a9e43cd38ff4ab159"} Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.564512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" event={"ID":"aef96bb9-192a-4934-b46d-2e2cea0ac97e","Type":"ContainerStarted","Data":"3d84889d32718f51e2243540ce6757c605fa7481c04400fad9c83ed7af2f8631"} Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.565923 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.568109 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" event={"ID":"178f722d-bf92-4584-b0cf-9550c41b3153","Type":"ContainerStarted","Data":"8317ce0c10accacaa9af1468f4090c0a5be8f5283eedf3cc51511588b0401e00"} Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.568820 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.570036 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" event={"ID":"6a0093d1-7a06-4147-816d-4bc7a73c505d","Type":"ContainerStarted","Data":"505ac49e944a55d102c2c6460537753f4c1e38f698fc23294fb93e44fb113af9"} Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.575291 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" event={"ID":"aae867b4-2097-459f-a413-0ead7e4478ce","Type":"ContainerStarted","Data":"e59bed4ba5ae5133b62336b936823c22ab34d88e13720299b3f837d024a7e932"} Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.577562 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" event={"ID":"47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6","Type":"ContainerStarted","Data":"f6f1ec8e969d5168114ded560f2b16e4bb305094b33e69f60c18a885b2b36aeb"} Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.583730 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.584958 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" podStartSLOduration=1.8939275759999998 podStartE2EDuration="14.584941668s" podCreationTimestamp="2025-10-06 12:19:07 +0000 UTC" firstStartedPulling="2025-10-06 12:19:08.457126823 +0000 UTC m=+635.006832588" lastFinishedPulling="2025-10-06 12:19:21.148140875 +0000 UTC m=+647.697846680" observedRunningTime="2025-10-06 12:19:21.581637857 +0000 UTC m=+648.131343612" watchObservedRunningTime="2025-10-06 12:19:21.584941668 +0000 UTC m=+648.134647453" Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.617120 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-4dkvd" podStartSLOduration=1.541403356 podStartE2EDuration="14.617102909s" podCreationTimestamp="2025-10-06 12:19:07 +0000 UTC" firstStartedPulling="2025-10-06 12:19:08.1156854 +0000 UTC m=+634.665391165" lastFinishedPulling="2025-10-06 12:19:21.191384953 +0000 UTC m=+647.741090718" observedRunningTime="2025-10-06 12:19:21.613155559 +0000 UTC m=+648.162861324" watchObservedRunningTime="2025-10-06 12:19:21.617102909 +0000 UTC m=+648.166808674" Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.636061 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx" podStartSLOduration=1.6542776799999999 podStartE2EDuration="14.636046953s" podCreationTimestamp="2025-10-06 12:19:07 +0000 UTC" firstStartedPulling="2025-10-06 12:19:08.145830334 +0000 UTC m=+634.695536099" lastFinishedPulling="2025-10-06 12:19:21.127599607 +0000 UTC m=+647.677305372" observedRunningTime="2025-10-06 12:19:21.632412172 +0000 UTC m=+648.182117937" watchObservedRunningTime="2025-10-06 12:19:21.636046953 +0000 UTC m=+648.185752718" Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.656834 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-8fxfv" podStartSLOduration=1.435783022 podStartE2EDuration="14.656819708s" 
podCreationTimestamp="2025-10-06 12:19:07 +0000 UTC" firstStartedPulling="2025-10-06 12:19:07.941620381 +0000 UTC m=+634.491326146" lastFinishedPulling="2025-10-06 12:19:21.162657077 +0000 UTC m=+647.712362832" observedRunningTime="2025-10-06 12:19:21.653902807 +0000 UTC m=+648.203608572" watchObservedRunningTime="2025-10-06 12:19:21.656819708 +0000 UTC m=+648.206525473" Oct 06 12:19:21 crc kubenswrapper[4892]: I1006 12:19:21.680463 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd" podStartSLOduration=1.963889352 podStartE2EDuration="14.680450792s" podCreationTimestamp="2025-10-06 12:19:07 +0000 UTC" firstStartedPulling="2025-10-06 12:19:08.430398683 +0000 UTC m=+634.980104448" lastFinishedPulling="2025-10-06 12:19:21.146960083 +0000 UTC m=+647.696665888" observedRunningTime="2025-10-06 12:19:21.676873743 +0000 UTC m=+648.226579508" watchObservedRunningTime="2025-10-06 12:19:21.680450792 +0000 UTC m=+648.230156557" Oct 06 12:19:22 crc kubenswrapper[4892]: I1006 12:19:22.984290 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:19:22 crc kubenswrapper[4892]: I1006 12:19:22.984354 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:19:22 crc kubenswrapper[4892]: I1006 12:19:22.984394 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:19:22 crc kubenswrapper[4892]: I1006 12:19:22.985063 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e79f1d5ccf3f44bb99369047594e47c87713adc95a8cf367cf3aea501eded4f0"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:19:22 crc kubenswrapper[4892]: I1006 12:19:22.985103 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://e79f1d5ccf3f44bb99369047594e47c87713adc95a8cf367cf3aea501eded4f0" gracePeriod=600 Oct 06 12:19:23 crc kubenswrapper[4892]: I1006 12:19:23.594734 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="e79f1d5ccf3f44bb99369047594e47c87713adc95a8cf367cf3aea501eded4f0" exitCode=0 Oct 06 12:19:23 crc kubenswrapper[4892]: I1006 12:19:23.594818 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"e79f1d5ccf3f44bb99369047594e47c87713adc95a8cf367cf3aea501eded4f0"} Oct 06 12:19:23 crc kubenswrapper[4892]: I1006 12:19:23.595363 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
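[Editor's note] The 12:19:22 lines show the full liveness contract: kubelet dials http://127.0.0.1:8798/health, gets connection refused, marks the probe unhealthy, kills the container with its grace period (600s here), and CRI-O restarts it; the subsequent RemoveContainer line is garbage collection of the previous dead instance. A minimal sketch of the kind of endpoint such a probe expects (the port is taken from the log; the handler body is illustrative):

```go
package main

import (
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()
	// Liveness endpoint: answer 200 while the process is healthy. Kubelet
	// treats a refused connection or a non-2xx status as a probe failure,
	// which is exactly what the patch_prober/prober lines above record.
	mux.HandleFunc("/health", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	})
	log.Fatal(http.ListenAndServe("127.0.0.1:8798", mux))
}
```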
pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"65a23fa133935013fcfc189b72a8929bb8d601f4fefb20b891097d8ee152e268"} Oct 06 12:19:23 crc kubenswrapper[4892]: I1006 12:19:23.595397 4892 scope.go:117] "RemoveContainer" containerID="fe252163f7a2babdff1e10cc57f09f2bd93ebf81d17649e7aa36213a49197603" Oct 06 12:19:28 crc kubenswrapper[4892]: I1006 12:19:28.043000 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-gtbwr" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.660675 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd"] Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.662520 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.671132 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.678581 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd"] Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.698063 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.698127 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4stls\" (UniqueName: \"kubernetes.io/projected/103c43a1-da8f-47d5-a72d-2f97e8bbae27-kube-api-access-4stls\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.698458 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.799380 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.799468 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4stls\" (UniqueName: 
\"kubernetes.io/projected/103c43a1-da8f-47d5-a72d-2f97e8bbae27-kube-api-access-4stls\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.799494 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.800513 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.800627 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.835257 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4stls\" (UniqueName: \"kubernetes.io/projected/103c43a1-da8f-47d5-a72d-2f97e8bbae27-kube-api-access-4stls\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:45 crc kubenswrapper[4892]: I1006 12:19:45.991488 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:19:46 crc kubenswrapper[4892]: I1006 12:19:46.447242 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd"]
Oct 06 12:19:46 crc kubenswrapper[4892]: W1006 12:19:46.460882 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod103c43a1_da8f_47d5_a72d_2f97e8bbae27.slice/crio-9191e6ae328e488aee8ccad232dcce2448bddcbfdeb19bb2e6ee8d90c4a9cc85 WatchSource:0}: Error finding container 9191e6ae328e488aee8ccad232dcce2448bddcbfdeb19bb2e6ee8d90c4a9cc85: Status 404 returned error can't find the container with id 9191e6ae328e488aee8ccad232dcce2448bddcbfdeb19bb2e6ee8d90c4a9cc85
Oct 06 12:19:46 crc kubenswrapper[4892]: I1006 12:19:46.753644 4892 generic.go:334] "Generic (PLEG): container finished" podID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerID="ca411b259bbdb9091777f80a63f1f2fabb322b19b75073f15725d96751c78534" exitCode=0
Oct 06 12:19:46 crc kubenswrapper[4892]: I1006 12:19:46.753710 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" event={"ID":"103c43a1-da8f-47d5-a72d-2f97e8bbae27","Type":"ContainerDied","Data":"ca411b259bbdb9091777f80a63f1f2fabb322b19b75073f15725d96751c78534"}
Oct 06 12:19:46 crc kubenswrapper[4892]: I1006 12:19:46.753749 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" event={"ID":"103c43a1-da8f-47d5-a72d-2f97e8bbae27","Type":"ContainerStarted","Data":"9191e6ae328e488aee8ccad232dcce2448bddcbfdeb19bb2e6ee8d90c4a9cc85"}
Oct 06 12:19:48 crc kubenswrapper[4892]: I1006 12:19:48.770488 4892 generic.go:334] "Generic (PLEG): container finished" podID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerID="e5c7debf82aa9204e8dc3f6a56dce3aeff13a7e95d72b6f28a2a8524b3ecfe35" exitCode=0
Oct 06 12:19:48 crc kubenswrapper[4892]: I1006 12:19:48.770569 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" event={"ID":"103c43a1-da8f-47d5-a72d-2f97e8bbae27","Type":"ContainerDied","Data":"e5c7debf82aa9204e8dc3f6a56dce3aeff13a7e95d72b6f28a2a8524b3ecfe35"}
Oct 06 12:19:51 crc kubenswrapper[4892]: I1006 12:19:51.794217 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" event={"ID":"103c43a1-da8f-47d5-a72d-2f97e8bbae27","Type":"ContainerStarted","Data":"64052fc27f415bfef7d5499de7e6119dfcb5a44c65ad35972e13ae74c171984d"}
Oct 06 12:19:52 crc kubenswrapper[4892]: I1006 12:19:52.804510 4892 generic.go:334] "Generic (PLEG): container finished" podID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerID="64052fc27f415bfef7d5499de7e6119dfcb5a44c65ad35972e13ae74c171984d" exitCode=0
Oct 06 12:19:52 crc kubenswrapper[4892]: I1006 12:19:52.804567 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" event={"ID":"103c43a1-da8f-47d5-a72d-2f97e8bbae27","Type":"ContainerDied","Data":"64052fc27f415bfef7d5499de7e6119dfcb5a44c65ad35972e13ae74c171984d"}
Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.185390 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd"
Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.219775 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4stls\" (UniqueName: \"kubernetes.io/projected/103c43a1-da8f-47d5-a72d-2f97e8bbae27-kube-api-access-4stls\") pod \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") "
Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.219898 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-bundle\") pod \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") "
Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.220048 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-util\") pod \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\" (UID: \"103c43a1-da8f-47d5-a72d-2f97e8bbae27\") "
Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.223431 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-bundle" (OuterVolumeSpecName: "bundle") pod "103c43a1-da8f-47d5-a72d-2f97e8bbae27" (UID: "103c43a1-da8f-47d5-a72d-2f97e8bbae27"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.227162 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103c43a1-da8f-47d5-a72d-2f97e8bbae27-kube-api-access-4stls" (OuterVolumeSpecName: "kube-api-access-4stls") pod "103c43a1-da8f-47d5-a72d-2f97e8bbae27" (UID: "103c43a1-da8f-47d5-a72d-2f97e8bbae27"). InnerVolumeSpecName "kube-api-access-4stls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.236762 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-util" (OuterVolumeSpecName: "util") pod "103c43a1-da8f-47d5-a72d-2f97e8bbae27" (UID: "103c43a1-da8f-47d5-a72d-2f97e8bbae27"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
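[Editor's note] The marketplace pod is a bundle-unpack job: its containers run to completion one after another (each "container finished" with exitCode=0), after which the volumes are unmounted and torn down. Pairing the Started/Died timestamps from the parsed PLEG events gives per-container runtimes; a sketch building on the parser above (klog omits the year, so one is assumed; 64052fc2... is likely the extract step, given the container names listed further below):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// klog prefixes look like "I1006 12:19:51.794217": month+day, then time.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006 0102 15:04:05.000000", "2025 "+s)
		if err != nil {
			panic(err)
		}
		return t
	}
	started := parse("1006 12:19:51.794217") // ContainerStarted 64052fc2...
	died := parse("1006 12:19:52.804510")    // ContainerDied    64052fc2...
	fmt.Println("container ran for", died.Sub(started)) // ~1.01s
}
```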
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.321490 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4stls\" (UniqueName: \"kubernetes.io/projected/103c43a1-da8f-47d5-a72d-2f97e8bbae27-kube-api-access-4stls\") on node \"crc\" DevicePath \"\"" Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.321523 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.321531 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/103c43a1-da8f-47d5-a72d-2f97e8bbae27-util\") on node \"crc\" DevicePath \"\"" Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.824589 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" event={"ID":"103c43a1-da8f-47d5-a72d-2f97e8bbae27","Type":"ContainerDied","Data":"9191e6ae328e488aee8ccad232dcce2448bddcbfdeb19bb2e6ee8d90c4a9cc85"} Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.824668 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd" Oct 06 12:19:54 crc kubenswrapper[4892]: I1006 12:19:54.824683 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9191e6ae328e488aee8ccad232dcce2448bddcbfdeb19bb2e6ee8d90c4a9cc85" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.248317 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn"] Oct 06 12:19:57 crc kubenswrapper[4892]: E1006 12:19:57.249262 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerName="util" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.249283 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerName="util" Oct 06 12:19:57 crc kubenswrapper[4892]: E1006 12:19:57.249302 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerName="pull" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.249313 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerName="pull" Oct 06 12:19:57 crc kubenswrapper[4892]: E1006 12:19:57.249366 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerName="extract" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.249379 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerName="extract" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.249540 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="103c43a1-da8f-47d5-a72d-2f97e8bbae27" containerName="extract" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.250141 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.252162 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.252435 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-l87jt" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.252439 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.261416 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn"] Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.362151 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95jft\" (UniqueName: \"kubernetes.io/projected/3912b44b-2305-4f14-8b86-5a5208df2442-kube-api-access-95jft\") pod \"nmstate-operator-858ddd8f98-xvhkn\" (UID: \"3912b44b-2305-4f14-8b86-5a5208df2442\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.463626 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95jft\" (UniqueName: \"kubernetes.io/projected/3912b44b-2305-4f14-8b86-5a5208df2442-kube-api-access-95jft\") pod \"nmstate-operator-858ddd8f98-xvhkn\" (UID: \"3912b44b-2305-4f14-8b86-5a5208df2442\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.481697 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95jft\" (UniqueName: \"kubernetes.io/projected/3912b44b-2305-4f14-8b86-5a5208df2442-kube-api-access-95jft\") pod \"nmstate-operator-858ddd8f98-xvhkn\" (UID: \"3912b44b-2305-4f14-8b86-5a5208df2442\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn" Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.564932 4892 util.go:30] "No sandbox for pod can be found. 
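[Editor's note] The cpu_manager/memory_manager lines are stale-state cleanup: once the bundle pod's util, pull, and extract containers are gone, their resource assignments are purged before the next pod is admitted (the E-level severity is routine here; RemoveStaleState logs cleanup at error level). A toy sketch of the pattern, with a state type of my own (kubelet's real store is checkpointed in pkg/kubelet/cm):

```go
package main

import "fmt"

// stateStore mimics the podUID -> containerName -> assignment map that the
// CPU manager reconciles against the set of pods still active on the node.
type stateStore map[string]map[string]string

func (s stateStore) removeStaleState(activePods map[string]bool) {
	for podUID, containers := range s {
		if activePods[podUID] {
			continue
		}
		for name := range containers {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", podUID, name)
		}
		delete(s, podUID)
	}
}

func main() {
	s := stateStore{"103c43a1-da8f-47d5-a72d-2f97e8bbae27": {"util": "0-1", "pull": "0-1", "extract": "0-1"}}
	s.removeStaleState(map[string]bool{}) // the bundle pod is no longer active
}
```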
Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.820405 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn"]
Oct 06 12:19:57 crc kubenswrapper[4892]: I1006 12:19:57.848613 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn" event={"ID":"3912b44b-2305-4f14-8b86-5a5208df2442","Type":"ContainerStarted","Data":"e0bd948ff7a2ea7b5b2adf390daa4e11c25326fcfb3adb88e182456b4d546fc0"}
Oct 06 12:20:00 crc kubenswrapper[4892]: I1006 12:20:00.870348 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn" event={"ID":"3912b44b-2305-4f14-8b86-5a5208df2442","Type":"ContainerStarted","Data":"2ba574ed50cca7dcb3779f686a62a268c285bcfa824f6a983d9df7b300499e6b"}
Oct 06 12:20:00 crc kubenswrapper[4892]: I1006 12:20:00.886688 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-xvhkn" podStartSLOduration=1.7488531109999998 podStartE2EDuration="3.886670635s" podCreationTimestamp="2025-10-06 12:19:57 +0000 UTC" firstStartedPulling="2025-10-06 12:19:57.833607404 +0000 UTC m=+684.383313159" lastFinishedPulling="2025-10-06 12:19:59.971424918 +0000 UTC m=+686.521130683" observedRunningTime="2025-10-06 12:20:00.885664861 +0000 UTC m=+687.435370666" watchObservedRunningTime="2025-10-06 12:20:00.886670635 +0000 UTC m=+687.436376410"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.875799 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"]
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.877484 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.880174 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-rsjf8"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.882854 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"]
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.883940 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.902108 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"]
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.905125 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8hgsn"]
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.905768 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.912005 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"]
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.919819 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4885l\" (UniqueName: \"kubernetes.io/projected/b33f5260-a8fd-4654-9a8c-49e30ed7857d-kube-api-access-4885l\") pod \"nmstate-metrics-fdff9cb8d-hrjsh\" (UID: \"b33f5260-a8fd-4654-9a8c-49e30ed7857d\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.920073 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8js8\" (UniqueName: \"kubernetes.io/projected/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-kube-api-access-v8js8\") pod \"nmstate-webhook-6cdbc54649-bqk58\" (UID: \"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.920171 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bqk58\" (UID: \"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:06 crc kubenswrapper[4892]: I1006 12:20:06.921672 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.014871 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"]
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.015707 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.017387 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.018737 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.018753 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bkw9s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.020984 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-nmstate-lock\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.021026 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8js8\" (UniqueName: \"kubernetes.io/projected/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-kube-api-access-v8js8\") pod \"nmstate-webhook-6cdbc54649-bqk58\" (UID: \"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.021051 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bqk58\" (UID: \"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.021099 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxfv4\" (UniqueName: \"kubernetes.io/projected/32fdd543-59ce-4feb-bd9f-9d804de6f71a-kube-api-access-cxfv4\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: E1006 12:20:07.021166 4892 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Oct 06 12:20:07 crc kubenswrapper[4892]: E1006 12:20:07.021204 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-tls-key-pair podName:f6c46ff4-2ed1-4acc-bae3-05a8db533ed7 nodeName:}" failed. No retries permitted until 2025-10-06 12:20:07.521188823 +0000 UTC m=+694.070894588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-tls-key-pair") pod "nmstate-webhook-6cdbc54649-bqk58" (UID: "f6c46ff4-2ed1-4acc-bae3-05a8db533ed7") : secret "openshift-nmstate-webhook" not found
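[Editor's note] The tls-key-pair failure is a startup race: the webhook pod was scheduled before its serving-cert secret existed, so SetUp fails and the operation is requeued with backoff (durationBeforeRetry 500ms); the retry at 12:20:07.530146 further below succeeds once the secret appears. A stdlib-only sketch of that retry shape, with a doubling backoff as kubelet's nestedpendingoperations uses; the mount function is a placeholder:

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

var errNotFound = errors.New(`secret "openshift-nmstate-webhook" not found`)

// mountTLSKeyPair stands in for MountVolume.SetUp; it fails until the
// operator has created the secret (simulated by the attempt count).
func mountTLSKeyPair(attempt int) error {
	if attempt < 2 {
		return errNotFound
	}
	return nil
}

func main() {
	backoff := 500 * time.Millisecond // kubelet's initial durationBeforeRetry
	for attempt := 0; ; attempt++ {
		if err := mountTLSKeyPair(attempt); err == nil {
			fmt.Println("MountVolume.SetUp succeeded")
			return
		} else {
			fmt.Printf("retrying in %v: %v\n", backoff, err)
		}
		time.Sleep(backoff)
		backoff *= 2 // exponential backoff, capped in the real implementation
	}
}
```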
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.021236 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4885l\" (UniqueName: \"kubernetes.io/projected/b33f5260-a8fd-4654-9a8c-49e30ed7857d-kube-api-access-4885l\") pod \"nmstate-metrics-fdff9cb8d-hrjsh\" (UID: \"b33f5260-a8fd-4654-9a8c-49e30ed7857d\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.021278 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-ovs-socket\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.021304 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-dbus-socket\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.028098 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"]
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.045862 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8js8\" (UniqueName: \"kubernetes.io/projected/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-kube-api-access-v8js8\") pod \"nmstate-webhook-6cdbc54649-bqk58\" (UID: \"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.048561 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4885l\" (UniqueName: \"kubernetes.io/projected/b33f5260-a8fd-4654-9a8c-49e30ed7857d-kube-api-access-4885l\") pod \"nmstate-metrics-fdff9cb8d-hrjsh\" (UID: \"b33f5260-a8fd-4654-9a8c-49e30ed7857d\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122189 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqf47\" (UniqueName: \"kubernetes.io/projected/175bab6c-5586-459d-b101-2ca420eb7885-kube-api-access-nqf47\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122272 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/175bab6c-5586-459d-b101-2ca420eb7885-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122309 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-ovs-socket\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122366 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-dbus-socket\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122412 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/175bab6c-5586-459d-b101-2ca420eb7885-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122459 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-nmstate-lock\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122528 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxfv4\" (UniqueName: \"kubernetes.io/projected/32fdd543-59ce-4feb-bd9f-9d804de6f71a-kube-api-access-cxfv4\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.125648 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-dbus-socket\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.125808 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-nmstate-lock\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.122453 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/32fdd543-59ce-4feb-bd9f-9d804de6f71a-ovs-socket\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.167512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxfv4\" (UniqueName: \"kubernetes.io/projected/32fdd543-59ce-4feb-bd9f-9d804de6f71a-kube-api-access-cxfv4\") pod \"nmstate-handler-8hgsn\" (UID: \"32fdd543-59ce-4feb-bd9f-9d804de6f71a\") " pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.208340 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-76f6957645-94qwb"]
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.209250 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.221286 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f6957645-94qwb"]
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.223986 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/175bab6c-5586-459d-b101-2ca420eb7885-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.224390 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqf47\" (UniqueName: \"kubernetes.io/projected/175bab6c-5586-459d-b101-2ca420eb7885-kube-api-access-nqf47\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.224469 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/175bab6c-5586-459d-b101-2ca420eb7885-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.226397 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/175bab6c-5586-459d-b101-2ca420eb7885-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.240903 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/175bab6c-5586-459d-b101-2ca420eb7885-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.249890 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.257437 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqf47\" (UniqueName: \"kubernetes.io/projected/175bab6c-5586-459d-b101-2ca420eb7885-kube-api-access-nqf47\") pod \"nmstate-console-plugin-6b874cbd85-6js6s\" (UID: \"175bab6c-5586-459d-b101-2ca420eb7885\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.265267 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-8hgsn"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.326633 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-config\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.327073 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-oauth-config\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.327110 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-trusted-ca-bundle\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.327134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-oauth-serving-cert\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.327224 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fkpx\" (UniqueName: \"kubernetes.io/projected/f6851bc6-27b3-4f4b-b60f-d49784bac6db-kube-api-access-9fkpx\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.327274 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-service-ca\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.327315 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-serving-cert\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.333356 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.428623 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-oauth-config\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.428709 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-trusted-ca-bundle\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.428732 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-oauth-serving-cert\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.428847 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fkpx\" (UniqueName: \"kubernetes.io/projected/f6851bc6-27b3-4f4b-b60f-d49784bac6db-kube-api-access-9fkpx\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.428892 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-service-ca\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.428951 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-serving-cert\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.429024 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-config\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.430092 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-config\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.430493 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-service-ca\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.430630 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-oauth-serving-cert\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.430909 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6851bc6-27b3-4f4b-b60f-d49784bac6db-trusted-ca-bundle\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.431653 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh"]
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.433720 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-oauth-config\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.435585 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6851bc6-27b3-4f4b-b60f-d49784bac6db-console-serving-cert\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: W1006 12:20:07.436551 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb33f5260_a8fd_4654_9a8c_49e30ed7857d.slice/crio-3ebbc85646a7c2b75a1c79bca310b858870a7c5cb064b52432e493fcc840e0ec WatchSource:0}: Error finding container 3ebbc85646a7c2b75a1c79bca310b858870a7c5cb064b52432e493fcc840e0ec: Status 404 returned error can't find the container with id 3ebbc85646a7c2b75a1c79bca310b858870a7c5cb064b52432e493fcc840e0ec
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.449379 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fkpx\" (UniqueName: \"kubernetes.io/projected/f6851bc6-27b3-4f4b-b60f-d49784bac6db-kube-api-access-9fkpx\") pod \"console-76f6957645-94qwb\" (UID: \"f6851bc6-27b3-4f4b-b60f-d49784bac6db\") " pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.523245 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76f6957645-94qwb"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.530146 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bqk58\" (UID: \"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.533769 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f6c46ff4-2ed1-4acc-bae3-05a8db533ed7-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-bqk58\" (UID: \"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.560189 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.720774 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s"]
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.929924 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh" event={"ID":"b33f5260-a8fd-4654-9a8c-49e30ed7857d","Type":"ContainerStarted","Data":"3ebbc85646a7c2b75a1c79bca310b858870a7c5cb064b52432e493fcc840e0ec"}
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.931115 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8hgsn" event={"ID":"32fdd543-59ce-4feb-bd9f-9d804de6f71a","Type":"ContainerStarted","Data":"beedb2b827d68d03cef47011618ca4947db8b0a8d79ff5a8a706a47c4f0eefb3"}
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.932685 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s" event={"ID":"175bab6c-5586-459d-b101-2ca420eb7885","Type":"ContainerStarted","Data":"37ac8223e879f1b64fe377fe7c31f01ea1a6e69a117f8b5095f8de024d44d073"}
Oct 06 12:20:07 crc kubenswrapper[4892]: I1006 12:20:07.948224 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76f6957645-94qwb"]
Oct 06 12:20:08 crc kubenswrapper[4892]: I1006 12:20:08.025342 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58"]
Oct 06 12:20:08 crc kubenswrapper[4892]: W1006 12:20:08.033767 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c46ff4_2ed1_4acc_bae3_05a8db533ed7.slice/crio-b880d742c5922dcf080cca14d53999a8c40a7311d5d1397afd44bf0f5374d464 WatchSource:0}: Error finding container b880d742c5922dcf080cca14d53999a8c40a7311d5d1397afd44bf0f5374d464: Status 404 returned error can't find the container with id b880d742c5922dcf080cca14d53999a8c40a7311d5d1397afd44bf0f5374d464
Oct 06 12:20:08 crc kubenswrapper[4892]: I1006 12:20:08.938098 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58" event={"ID":"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7","Type":"ContainerStarted","Data":"b880d742c5922dcf080cca14d53999a8c40a7311d5d1397afd44bf0f5374d464"}
Oct 06 12:20:08 crc kubenswrapper[4892]: I1006 12:20:08.939594 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f6957645-94qwb" event={"ID":"f6851bc6-27b3-4f4b-b60f-d49784bac6db","Type":"ContainerStarted","Data":"e7ec44330ba3f73958f92e0c8571f322a0fb080e184b2ae44f509c983734190d"}
pod="openshift-console/console-76f6957645-94qwb" event={"ID":"f6851bc6-27b3-4f4b-b60f-d49784bac6db","Type":"ContainerStarted","Data":"e7ec44330ba3f73958f92e0c8571f322a0fb080e184b2ae44f509c983734190d"} Oct 06 12:20:08 crc kubenswrapper[4892]: I1006 12:20:08.939634 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76f6957645-94qwb" event={"ID":"f6851bc6-27b3-4f4b-b60f-d49784bac6db","Type":"ContainerStarted","Data":"b441151cb2011c2bedb5dd763b0af3c6ac80096e863ae428048da2ff3aae3f4d"} Oct 06 12:20:08 crc kubenswrapper[4892]: I1006 12:20:08.962192 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76f6957645-94qwb" podStartSLOduration=1.962168572 podStartE2EDuration="1.962168572s" podCreationTimestamp="2025-10-06 12:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:20:08.955876596 +0000 UTC m=+695.505582361" watchObservedRunningTime="2025-10-06 12:20:08.962168572 +0000 UTC m=+695.511874377" Oct 06 12:20:09 crc kubenswrapper[4892]: I1006 12:20:09.947398 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh" event={"ID":"b33f5260-a8fd-4654-9a8c-49e30ed7857d","Type":"ContainerStarted","Data":"29318548b810d6d3ba3a80f1fe651ca56c84ab334915b779ef076d4172bf845e"} Oct 06 12:20:09 crc kubenswrapper[4892]: I1006 12:20:09.948722 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8hgsn" event={"ID":"32fdd543-59ce-4feb-bd9f-9d804de6f71a","Type":"ContainerStarted","Data":"9c28c24ae87ddbff902f864f47c5b9c9e1365c5cb4e906347f95c12dc90814ea"} Oct 06 12:20:09 crc kubenswrapper[4892]: I1006 12:20:09.948770 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8hgsn" Oct 06 12:20:09 crc kubenswrapper[4892]: I1006 12:20:09.950790 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58" event={"ID":"f6c46ff4-2ed1-4acc-bae3-05a8db533ed7","Type":"ContainerStarted","Data":"7bb7f30d503a28797bc1dbbc222a120b11ad14588633c678652651c874c4a8ce"} Oct 06 12:20:09 crc kubenswrapper[4892]: I1006 12:20:09.950837 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58" Oct 06 12:20:09 crc kubenswrapper[4892]: I1006 12:20:09.967580 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8hgsn" podStartSLOduration=1.6931893900000001 podStartE2EDuration="3.967561198s" podCreationTimestamp="2025-10-06 12:20:06 +0000 UTC" firstStartedPulling="2025-10-06 12:20:07.2870815 +0000 UTC m=+693.836787265" lastFinishedPulling="2025-10-06 12:20:09.561453308 +0000 UTC m=+696.111159073" observedRunningTime="2025-10-06 12:20:09.963029806 +0000 UTC m=+696.512735591" watchObservedRunningTime="2025-10-06 12:20:09.967561198 +0000 UTC m=+696.517266963" Oct 06 12:20:09 crc kubenswrapper[4892]: I1006 12:20:09.996156 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58" podStartSLOduration=2.488630442 podStartE2EDuration="3.996132978s" podCreationTimestamp="2025-10-06 12:20:06 +0000 UTC" firstStartedPulling="2025-10-06 12:20:08.035981967 +0000 UTC m=+694.585687732" lastFinishedPulling="2025-10-06 12:20:09.543484503 +0000 UTC m=+696.093190268" 
observedRunningTime="2025-10-06 12:20:09.976693753 +0000 UTC m=+696.526399518" watchObservedRunningTime="2025-10-06 12:20:09.996132978 +0000 UTC m=+696.545838743" Oct 06 12:20:10 crc kubenswrapper[4892]: I1006 12:20:10.961626 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s" event={"ID":"175bab6c-5586-459d-b101-2ca420eb7885","Type":"ContainerStarted","Data":"cf70d222b8b753f388f3a97290f88e4a74e1cb5a63ea224250107df361f1aff6"} Oct 06 12:20:10 crc kubenswrapper[4892]: I1006 12:20:10.989836 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-6js6s" podStartSLOduration=2.321425731 podStartE2EDuration="4.989796183s" podCreationTimestamp="2025-10-06 12:20:06 +0000 UTC" firstStartedPulling="2025-10-06 12:20:07.732676959 +0000 UTC m=+694.282382724" lastFinishedPulling="2025-10-06 12:20:10.401047391 +0000 UTC m=+696.950753176" observedRunningTime="2025-10-06 12:20:10.989454958 +0000 UTC m=+697.539160733" watchObservedRunningTime="2025-10-06 12:20:10.989796183 +0000 UTC m=+697.539501958" Oct 06 12:20:12 crc kubenswrapper[4892]: I1006 12:20:12.975793 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh" event={"ID":"b33f5260-a8fd-4654-9a8c-49e30ed7857d","Type":"ContainerStarted","Data":"0094cca6cbe636a2a4abfc02b3c709b8fc6d0359ca8a0dd443c666f5384bf56c"} Oct 06 12:20:13 crc kubenswrapper[4892]: I1006 12:20:13.002304 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-hrjsh" podStartSLOduration=2.550218222 podStartE2EDuration="7.002278278s" podCreationTimestamp="2025-10-06 12:20:06 +0000 UTC" firstStartedPulling="2025-10-06 12:20:07.445478901 +0000 UTC m=+693.995184666" lastFinishedPulling="2025-10-06 12:20:11.897538957 +0000 UTC m=+698.447244722" observedRunningTime="2025-10-06 12:20:13.000038708 +0000 UTC m=+699.549744533" watchObservedRunningTime="2025-10-06 12:20:13.002278278 +0000 UTC m=+699.551984083" Oct 06 12:20:17 crc kubenswrapper[4892]: I1006 12:20:17.302545 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8hgsn" Oct 06 12:20:17 crc kubenswrapper[4892]: I1006 12:20:17.524215 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76f6957645-94qwb" Oct 06 12:20:17 crc kubenswrapper[4892]: I1006 12:20:17.524678 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-76f6957645-94qwb" Oct 06 12:20:17 crc kubenswrapper[4892]: I1006 12:20:17.531877 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76f6957645-94qwb" Oct 06 12:20:18 crc kubenswrapper[4892]: I1006 12:20:18.017235 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76f6957645-94qwb" Oct 06 12:20:18 crc kubenswrapper[4892]: I1006 12:20:18.096631 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g2wtm"] Oct 06 12:20:27 crc kubenswrapper[4892]: I1006 12:20:27.568842 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-bqk58" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.174853 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-f9d7485db-g2wtm" podUID="d26efdd9-e946-418f-95a6-0100f0364b92" containerName="console" containerID="cri-o://8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76" gracePeriod=15 Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.665377 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g2wtm_d26efdd9-e946-418f-95a6-0100f0364b92/console/0.log" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.665687 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.840663 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-trusted-ca-bundle\") pod \"d26efdd9-e946-418f-95a6-0100f0364b92\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.840750 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-serving-cert\") pod \"d26efdd9-e946-418f-95a6-0100f0364b92\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.840831 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-oauth-serving-cert\") pod \"d26efdd9-e946-418f-95a6-0100f0364b92\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.840875 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q255f\" (UniqueName: \"kubernetes.io/projected/d26efdd9-e946-418f-95a6-0100f0364b92-kube-api-access-q255f\") pod \"d26efdd9-e946-418f-95a6-0100f0364b92\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.840962 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-console-config\") pod \"d26efdd9-e946-418f-95a6-0100f0364b92\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.841064 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-service-ca\") pod \"d26efdd9-e946-418f-95a6-0100f0364b92\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.841179 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-oauth-config\") pod \"d26efdd9-e946-418f-95a6-0100f0364b92\" (UID: \"d26efdd9-e946-418f-95a6-0100f0364b92\") " Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.841681 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-console-config" (OuterVolumeSpecName: "console-config") pod "d26efdd9-e946-418f-95a6-0100f0364b92" (UID: "d26efdd9-e946-418f-95a6-0100f0364b92"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.841692 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d26efdd9-e946-418f-95a6-0100f0364b92" (UID: "d26efdd9-e946-418f-95a6-0100f0364b92"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.841712 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-service-ca" (OuterVolumeSpecName: "service-ca") pod "d26efdd9-e946-418f-95a6-0100f0364b92" (UID: "d26efdd9-e946-418f-95a6-0100f0364b92"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.841874 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d26efdd9-e946-418f-95a6-0100f0364b92" (UID: "d26efdd9-e946-418f-95a6-0100f0364b92"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.847227 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d26efdd9-e946-418f-95a6-0100f0364b92" (UID: "d26efdd9-e946-418f-95a6-0100f0364b92"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.848348 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d26efdd9-e946-418f-95a6-0100f0364b92" (UID: "d26efdd9-e946-418f-95a6-0100f0364b92"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.853561 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26efdd9-e946-418f-95a6-0100f0364b92-kube-api-access-q255f" (OuterVolumeSpecName: "kube-api-access-q255f") pod "d26efdd9-e946-418f-95a6-0100f0364b92" (UID: "d26efdd9-e946-418f-95a6-0100f0364b92"). InnerVolumeSpecName "kube-api-access-q255f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.943070 4892 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.943567 4892 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.943579 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q255f\" (UniqueName: \"kubernetes.io/projected/d26efdd9-e946-418f-95a6-0100f0364b92-kube-api-access-q255f\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.943590 4892 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.943598 4892 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-console-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.943606 4892 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26efdd9-e946-418f-95a6-0100f0364b92-service-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:43 crc kubenswrapper[4892]: I1006 12:20:43.943614 4892 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26efdd9-e946-418f-95a6-0100f0364b92-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.207570 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g2wtm_d26efdd9-e946-418f-95a6-0100f0364b92/console/0.log" Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.207618 4892 generic.go:334] "Generic (PLEG): container finished" podID="d26efdd9-e946-418f-95a6-0100f0364b92" containerID="8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76" exitCode=2 Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.207648 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g2wtm" event={"ID":"d26efdd9-e946-418f-95a6-0100f0364b92","Type":"ContainerDied","Data":"8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76"} Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.207673 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g2wtm" event={"ID":"d26efdd9-e946-418f-95a6-0100f0364b92","Type":"ContainerDied","Data":"6a9cc930a2d45034665cfc9fab2c236e12427cad98218fbcd6a694cb2fbc93b9"} Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.207690 4892 scope.go:117] "RemoveContainer" containerID="8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76" Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.207733 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-g2wtm" Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.232122 4892 scope.go:117] "RemoveContainer" containerID="8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76" Oct 06 12:20:44 crc kubenswrapper[4892]: E1006 12:20:44.232679 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76\": container with ID starting with 8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76 not found: ID does not exist" containerID="8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76" Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.232718 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76"} err="failed to get container status \"8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76\": rpc error: code = NotFound desc = could not find container \"8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76\": container with ID starting with 8ae1be18a7b690543a2ffaa8ff58e4da33c30a0b1aa302d96c97b8ba2f925c76 not found: ID does not exist" Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.235601 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g2wtm"] Oct 06 12:20:44 crc kubenswrapper[4892]: I1006 12:20:44.241595 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-g2wtm"] Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.086312 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp"] Oct 06 12:20:46 crc kubenswrapper[4892]: E1006 12:20:46.086687 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26efdd9-e946-418f-95a6-0100f0364b92" containerName="console" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.086707 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26efdd9-e946-418f-95a6-0100f0364b92" containerName="console" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.086887 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26efdd9-e946-418f-95a6-0100f0364b92" containerName="console" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.088237 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.090593 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.100852 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp"] Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.177650 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26efdd9-e946-418f-95a6-0100f0364b92" path="/var/lib/kubelet/pods/d26efdd9-e946-418f-95a6-0100f0364b92/volumes" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.271636 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.272694 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92dlw\" (UniqueName: \"kubernetes.io/projected/b00a505d-365a-4d52-b900-33073e3b4e84-kube-api-access-92dlw\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.272976 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.374541 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.374635 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92dlw\" (UniqueName: \"kubernetes.io/projected/b00a505d-365a-4d52-b900-33073e3b4e84-kube-api-access-92dlw\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.374719 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.375442 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.375506 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.405882 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92dlw\" (UniqueName: \"kubernetes.io/projected/b00a505d-365a-4d52-b900-33073e3b4e84-kube-api-access-92dlw\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.411218 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:46 crc kubenswrapper[4892]: I1006 12:20:46.919789 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp"] Oct 06 12:20:47 crc kubenswrapper[4892]: I1006 12:20:47.231827 4892 generic.go:334] "Generic (PLEG): container finished" podID="b00a505d-365a-4d52-b900-33073e3b4e84" containerID="4555d2e3a81a8a7ecc934908180794acfa66cfaea218896d29ee95a06ae823bc" exitCode=0 Oct 06 12:20:47 crc kubenswrapper[4892]: I1006 12:20:47.231868 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" event={"ID":"b00a505d-365a-4d52-b900-33073e3b4e84","Type":"ContainerDied","Data":"4555d2e3a81a8a7ecc934908180794acfa66cfaea218896d29ee95a06ae823bc"} Oct 06 12:20:47 crc kubenswrapper[4892]: I1006 12:20:47.231893 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" event={"ID":"b00a505d-365a-4d52-b900-33073e3b4e84","Type":"ContainerStarted","Data":"9581ab15629ec8062dd6e0057eb5287e5924dcb82d8767e3550a83605d248b89"} Oct 06 12:20:49 crc kubenswrapper[4892]: I1006 12:20:49.247941 4892 generic.go:334] "Generic (PLEG): container finished" podID="b00a505d-365a-4d52-b900-33073e3b4e84" containerID="16e3938ecab59e82185e06ae67be82228f3de702b8b0a5ed7b77ce0548f28309" exitCode=0 Oct 06 12:20:49 crc kubenswrapper[4892]: I1006 12:20:49.248309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" event={"ID":"b00a505d-365a-4d52-b900-33073e3b4e84","Type":"ContainerDied","Data":"16e3938ecab59e82185e06ae67be82228f3de702b8b0a5ed7b77ce0548f28309"} Oct 06 12:20:50 crc kubenswrapper[4892]: I1006 
12:20:50.259641 4892 generic.go:334] "Generic (PLEG): container finished" podID="b00a505d-365a-4d52-b900-33073e3b4e84" containerID="419f3ce40dbcc55462cec9b4cbfe718c839323fef8d52a0629e4470b40b399ef" exitCode=0 Oct 06 12:20:50 crc kubenswrapper[4892]: I1006 12:20:50.259769 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" event={"ID":"b00a505d-365a-4d52-b900-33073e3b4e84","Type":"ContainerDied","Data":"419f3ce40dbcc55462cec9b4cbfe718c839323fef8d52a0629e4470b40b399ef"} Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.562883 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.749625 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-bundle\") pod \"b00a505d-365a-4d52-b900-33073e3b4e84\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.749682 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92dlw\" (UniqueName: \"kubernetes.io/projected/b00a505d-365a-4d52-b900-33073e3b4e84-kube-api-access-92dlw\") pod \"b00a505d-365a-4d52-b900-33073e3b4e84\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.749803 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-util\") pod \"b00a505d-365a-4d52-b900-33073e3b4e84\" (UID: \"b00a505d-365a-4d52-b900-33073e3b4e84\") " Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.751893 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-bundle" (OuterVolumeSpecName: "bundle") pod "b00a505d-365a-4d52-b900-33073e3b4e84" (UID: "b00a505d-365a-4d52-b900-33073e3b4e84"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.756632 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00a505d-365a-4d52-b900-33073e3b4e84-kube-api-access-92dlw" (OuterVolumeSpecName: "kube-api-access-92dlw") pod "b00a505d-365a-4d52-b900-33073e3b4e84" (UID: "b00a505d-365a-4d52-b900-33073e3b4e84"). InnerVolumeSpecName "kube-api-access-92dlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.768110 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-util" (OuterVolumeSpecName: "util") pod "b00a505d-365a-4d52-b900-33073e3b4e84" (UID: "b00a505d-365a-4d52-b900-33073e3b4e84"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.851428 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92dlw\" (UniqueName: \"kubernetes.io/projected/b00a505d-365a-4d52-b900-33073e3b4e84-kube-api-access-92dlw\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.851543 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-util\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:51 crc kubenswrapper[4892]: I1006 12:20:51.851569 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b00a505d-365a-4d52-b900-33073e3b4e84-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:20:52 crc kubenswrapper[4892]: I1006 12:20:52.277837 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" event={"ID":"b00a505d-365a-4d52-b900-33073e3b4e84","Type":"ContainerDied","Data":"9581ab15629ec8062dd6e0057eb5287e5924dcb82d8767e3550a83605d248b89"} Oct 06 12:20:52 crc kubenswrapper[4892]: I1006 12:20:52.277891 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9581ab15629ec8062dd6e0057eb5287e5924dcb82d8767e3550a83605d248b89" Oct 06 12:20:52 crc kubenswrapper[4892]: I1006 12:20:52.277981 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.439674 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96"] Oct 06 12:21:03 crc kubenswrapper[4892]: E1006 12:21:03.440474 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00a505d-365a-4d52-b900-33073e3b4e84" containerName="extract" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.440491 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00a505d-365a-4d52-b900-33073e3b4e84" containerName="extract" Oct 06 12:21:03 crc kubenswrapper[4892]: E1006 12:21:03.440700 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00a505d-365a-4d52-b900-33073e3b4e84" containerName="pull" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.440706 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00a505d-365a-4d52-b900-33073e3b4e84" containerName="pull" Oct 06 12:21:03 crc kubenswrapper[4892]: E1006 12:21:03.440719 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00a505d-365a-4d52-b900-33073e3b4e84" containerName="util" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.440726 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00a505d-365a-4d52-b900-33073e3b4e84" containerName="util" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.440819 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00a505d-365a-4d52-b900-33073e3b4e84" containerName="extract" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.441366 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.443092 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.443346 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.443742 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-5kgwz" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.443894 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.444913 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.463317 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96"] Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.486258 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wrctr"] Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.486467 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" podUID="1335413b-43df-4ec7-a45d-eb1094b8a125" containerName="controller-manager" containerID="cri-o://0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c" gracePeriod=30 Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.553252 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm"] Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.553482 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" podUID="f130811c-c622-4ee1-994b-cda735eaaf41" containerName="route-controller-manager" containerID="cri-o://8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698" gracePeriod=30 Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.615295 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-apiservice-cert\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.615359 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-webhook-cert\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.615400 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-446nk\" (UniqueName: 
\"kubernetes.io/projected/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-kube-api-access-446nk\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.716941 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-webhook-cert\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.717219 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-446nk\" (UniqueName: \"kubernetes.io/projected/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-kube-api-access-446nk\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.717273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-apiservice-cert\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.722299 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-apiservice-cert\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.722833 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-webhook-cert\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.738220 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-446nk\" (UniqueName: \"kubernetes.io/projected/0e90157b-0ee2-45ab-b457-e4dd396bfcf4-kube-api-access-446nk\") pod \"metallb-operator-controller-manager-5dc65fffc5-fws96\" (UID: \"0e90157b-0ee2-45ab-b457-e4dd396bfcf4\") " pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.757754 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.881198 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms"] Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.890727 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.893755 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tb2fr" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.893894 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.894000 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.899623 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms"] Oct 06 12:21:03 crc kubenswrapper[4892]: I1006 12:21:03.933677 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.001837 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.027770 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n68d\" (UniqueName: \"kubernetes.io/projected/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-kube-api-access-6n68d\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.027808 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-webhook-cert\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.027935 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-apiservice-cert\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129685 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-config\") pod \"f130811c-c622-4ee1-994b-cda735eaaf41\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129734 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1335413b-43df-4ec7-a45d-eb1094b8a125-serving-cert\") pod \"1335413b-43df-4ec7-a45d-eb1094b8a125\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129751 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f130811c-c622-4ee1-994b-cda735eaaf41-serving-cert\") pod \"f130811c-c622-4ee1-994b-cda735eaaf41\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129773 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwvxk\" (UniqueName: \"kubernetes.io/projected/f130811c-c622-4ee1-994b-cda735eaaf41-kube-api-access-fwvxk\") pod \"f130811c-c622-4ee1-994b-cda735eaaf41\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129828 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vbsw\" (UniqueName: \"kubernetes.io/projected/1335413b-43df-4ec7-a45d-eb1094b8a125-kube-api-access-8vbsw\") pod \"1335413b-43df-4ec7-a45d-eb1094b8a125\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129843 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-proxy-ca-bundles\") pod \"1335413b-43df-4ec7-a45d-eb1094b8a125\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129858 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-client-ca\") pod \"f130811c-c622-4ee1-994b-cda735eaaf41\" (UID: \"f130811c-c622-4ee1-994b-cda735eaaf41\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129894 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-client-ca\") pod \"1335413b-43df-4ec7-a45d-eb1094b8a125\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.129910 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-config\") pod \"1335413b-43df-4ec7-a45d-eb1094b8a125\" (UID: \"1335413b-43df-4ec7-a45d-eb1094b8a125\") " Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.130123 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-apiservice-cert\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.130146 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n68d\" (UniqueName: \"kubernetes.io/projected/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-kube-api-access-6n68d\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.130167 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-webhook-cert\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " 
pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.130852 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-config" (OuterVolumeSpecName: "config") pod "f130811c-c622-4ee1-994b-cda735eaaf41" (UID: "f130811c-c622-4ee1-994b-cda735eaaf41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.130883 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-client-ca" (OuterVolumeSpecName: "client-ca") pod "f130811c-c622-4ee1-994b-cda735eaaf41" (UID: "f130811c-c622-4ee1-994b-cda735eaaf41"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.131616 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-config" (OuterVolumeSpecName: "config") pod "1335413b-43df-4ec7-a45d-eb1094b8a125" (UID: "1335413b-43df-4ec7-a45d-eb1094b8a125"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.132082 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-client-ca" (OuterVolumeSpecName: "client-ca") pod "1335413b-43df-4ec7-a45d-eb1094b8a125" (UID: "1335413b-43df-4ec7-a45d-eb1094b8a125"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.135865 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1335413b-43df-4ec7-a45d-eb1094b8a125" (UID: "1335413b-43df-4ec7-a45d-eb1094b8a125"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.136284 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f130811c-c622-4ee1-994b-cda735eaaf41-kube-api-access-fwvxk" (OuterVolumeSpecName: "kube-api-access-fwvxk") pod "f130811c-c622-4ee1-994b-cda735eaaf41" (UID: "f130811c-c622-4ee1-994b-cda735eaaf41"). InnerVolumeSpecName "kube-api-access-fwvxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.137018 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f130811c-c622-4ee1-994b-cda735eaaf41-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f130811c-c622-4ee1-994b-cda735eaaf41" (UID: "f130811c-c622-4ee1-994b-cda735eaaf41"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.138468 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1335413b-43df-4ec7-a45d-eb1094b8a125-kube-api-access-8vbsw" (OuterVolumeSpecName: "kube-api-access-8vbsw") pod "1335413b-43df-4ec7-a45d-eb1094b8a125" (UID: "1335413b-43df-4ec7-a45d-eb1094b8a125"). InnerVolumeSpecName "kube-api-access-8vbsw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.140076 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-apiservice-cert\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.140307 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1335413b-43df-4ec7-a45d-eb1094b8a125-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1335413b-43df-4ec7-a45d-eb1094b8a125" (UID: "1335413b-43df-4ec7-a45d-eb1094b8a125"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.141508 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-webhook-cert\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.153992 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n68d\" (UniqueName: \"kubernetes.io/projected/3c11a8ae-6eac-4709-9c88-aa7048e7bb08-kube-api-access-6n68d\") pod \"metallb-operator-webhook-server-cf886c89f-mc4ms\" (UID: \"3c11a8ae-6eac-4709-9c88-aa7048e7bb08\") " pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.215881 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233094 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vbsw\" (UniqueName: \"kubernetes.io/projected/1335413b-43df-4ec7-a45d-eb1094b8a125-kube-api-access-8vbsw\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233129 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233141 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233150 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-client-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233159 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1335413b-43df-4ec7-a45d-eb1094b8a125-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233167 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f130811c-c622-4ee1-994b-cda735eaaf41-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233176 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1335413b-43df-4ec7-a45d-eb1094b8a125-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233183 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f130811c-c622-4ee1-994b-cda735eaaf41-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.233191 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwvxk\" (UniqueName: \"kubernetes.io/projected/f130811c-c622-4ee1-994b-cda735eaaf41-kube-api-access-fwvxk\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.338996 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96"] Oct 06 12:21:04 crc kubenswrapper[4892]: W1006 12:21:04.355204 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e90157b_0ee2_45ab_b457_e4dd396bfcf4.slice/crio-0f047eac2495975b2d3220b040339cbb97b5d5d164635eecd49bb58b0fdc2f38 WatchSource:0}: Error finding container 0f047eac2495975b2d3220b040339cbb97b5d5d164635eecd49bb58b0fdc2f38: Status 404 returned error can't find the container with id 0f047eac2495975b2d3220b040339cbb97b5d5d164635eecd49bb58b0fdc2f38 Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.357034 4892 generic.go:334] "Generic (PLEG): container finished" podID="f130811c-c622-4ee1-994b-cda735eaaf41" containerID="8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698" exitCode=0 Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.357104 4892 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.357130 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" event={"ID":"f130811c-c622-4ee1-994b-cda735eaaf41","Type":"ContainerDied","Data":"8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698"}
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.357159 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm" event={"ID":"f130811c-c622-4ee1-994b-cda735eaaf41","Type":"ContainerDied","Data":"a12ce925db6906c0ebe7074a79fe12ab2084dcd091196c8156f7623832951523"}
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.357185 4892 scope.go:117] "RemoveContainer" containerID="8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.361028 4892 generic.go:334] "Generic (PLEG): container finished" podID="1335413b-43df-4ec7-a45d-eb1094b8a125" containerID="0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c" exitCode=0
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.361059 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" event={"ID":"1335413b-43df-4ec7-a45d-eb1094b8a125","Type":"ContainerDied","Data":"0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c"}
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.361150 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.361080 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wrctr" event={"ID":"1335413b-43df-4ec7-a45d-eb1094b8a125","Type":"ContainerDied","Data":"caf8ae585e03af8d95e558bac29784cac621a9b4eca2a862f50437e196d2e9d8"}
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.376507 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm"]
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.379818 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gh9gm"]
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.388203 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wrctr"]
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.393781 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wrctr"]
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.405872 4892 scope.go:117] "RemoveContainer" containerID="8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698"
Oct 06 12:21:04 crc kubenswrapper[4892]: E1006 12:21:04.406256 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698\": container with ID starting with 8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698 not found: ID does not exist" containerID="8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.406307 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698"} err="failed to get container status \"8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698\": rpc error: code = NotFound desc = could not find container \"8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698\": container with ID starting with 8300334d8a7129a6042c08a0b38fec325dd5dc89231dcd5ede33a5ad434de698 not found: ID does not exist"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.406354 4892 scope.go:117] "RemoveContainer" containerID="0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.419363 4892 scope.go:117] "RemoveContainer" containerID="0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c"
Oct 06 12:21:04 crc kubenswrapper[4892]: E1006 12:21:04.420212 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c\": container with ID starting with 0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c not found: ID does not exist" containerID="0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.420239 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c"} err="failed to get container status \"0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c\": rpc error: code = NotFound desc = could not find container \"0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c\": container with ID starting with 0617ae69f92aa6d536d50e0675a557d00f753f50043075567ba224053445141c not found: ID does not exist"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.427054 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms"]
Oct 06 12:21:04 crc kubenswrapper[4892]: W1006 12:21:04.431340 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c11a8ae_6eac_4709_9c88_aa7048e7bb08.slice/crio-b4af1445b7391008965e055e8120284767d65ab542bb6eaa21a7856e4c9101dd WatchSource:0}: Error finding container b4af1445b7391008965e055e8120284767d65ab542bb6eaa21a7856e4c9101dd: Status 404 returned error can't find the container with id b4af1445b7391008965e055e8120284767d65ab542bb6eaa21a7856e4c9101dd
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.734904 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"]
Oct 06 12:21:04 crc kubenswrapper[4892]: E1006 12:21:04.735290 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1335413b-43df-4ec7-a45d-eb1094b8a125" containerName="controller-manager"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.735311 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1335413b-43df-4ec7-a45d-eb1094b8a125" containerName="controller-manager"
Oct 06 12:21:04 crc kubenswrapper[4892]: E1006 12:21:04.735371 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f130811c-c622-4ee1-994b-cda735eaaf41" containerName="route-controller-manager"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.735387 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f130811c-c622-4ee1-994b-cda735eaaf41" containerName="route-controller-manager"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.735561 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f130811c-c622-4ee1-994b-cda735eaaf41" containerName="route-controller-manager"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.735601 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1335413b-43df-4ec7-a45d-eb1094b8a125" containerName="controller-manager"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.736268 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.738066 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"]
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.738831 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.738862 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.738854 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.739568 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.739925 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.740004 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.741128 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.741285 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.741796 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.741919 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.742026 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.743946 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.752401 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.754639 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.756112 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"]
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.762807 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"]
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.841585 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-proxy-ca-bundles\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.841629 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shvwn\" (UniqueName: \"kubernetes.io/projected/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-kube-api-access-shvwn\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.841656 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcb9t\" (UniqueName: \"kubernetes.io/projected/8e77e54b-a880-4131-8873-5aa507b5e286-kube-api-access-wcb9t\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.841712 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-config\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.841731 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e77e54b-a880-4131-8873-5aa507b5e286-serving-cert\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.841861 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-client-ca\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.841904 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-config\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.842030 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-serving-cert\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.842134 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-client-ca\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.942828 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-client-ca\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.942904 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shvwn\" (UniqueName: \"kubernetes.io/projected/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-kube-api-access-shvwn\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.942933 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-proxy-ca-bundles\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.942962 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcb9t\" (UniqueName: \"kubernetes.io/projected/8e77e54b-a880-4131-8873-5aa507b5e286-kube-api-access-wcb9t\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.943005 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-config\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.943027 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e77e54b-a880-4131-8873-5aa507b5e286-serving-cert\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.943052 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-client-ca\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.943074 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-config\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.943115 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-serving-cert\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.943972 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-client-ca\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.944459 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-config\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.944573 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-proxy-ca-bundles\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.945141 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-client-ca\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.946034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-config\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.947583 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e77e54b-a880-4131-8873-5aa507b5e286-serving-cert\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.960512 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-serving-cert\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.963257 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcb9t\" (UniqueName: \"kubernetes.io/projected/8e77e54b-a880-4131-8873-5aa507b5e286-kube-api-access-wcb9t\") pod \"controller-manager-6bcbd4fd74-sp68l\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") " pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:04 crc kubenswrapper[4892]: I1006 12:21:04.972265 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shvwn\" (UniqueName: \"kubernetes.io/projected/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-kube-api-access-shvwn\") pod \"route-controller-manager-64d6fd6558-lbcbh\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") " pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.064802 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.084761 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.131596 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"]
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.136292 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"]
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.369211 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" event={"ID":"0e90157b-0ee2-45ab-b457-e4dd396bfcf4","Type":"ContainerStarted","Data":"0f047eac2495975b2d3220b040339cbb97b5d5d164635eecd49bb58b0fdc2f38"}
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.372250 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" event={"ID":"3c11a8ae-6eac-4709-9c88-aa7048e7bb08","Type":"ContainerStarted","Data":"b4af1445b7391008965e055e8120284767d65ab542bb6eaa21a7856e4c9101dd"}
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.536298 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"]
Oct 06 12:21:05 crc kubenswrapper[4892]: I1006 12:21:05.625471 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"]
Oct 06 12:21:05 crc kubenswrapper[4892]: W1006 12:21:05.638004 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e77e54b_a880_4131_8873_5aa507b5e286.slice/crio-79925ba5a26e8c0a2198e6377c25488f373680a4b5aa41039032da48cda806d7 WatchSource:0}: Error finding container 79925ba5a26e8c0a2198e6377c25488f373680a4b5aa41039032da48cda806d7: Status 404 returned error can't find the container with id 79925ba5a26e8c0a2198e6377c25488f373680a4b5aa41039032da48cda806d7
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.176903 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1335413b-43df-4ec7-a45d-eb1094b8a125" path="/var/lib/kubelet/pods/1335413b-43df-4ec7-a45d-eb1094b8a125/volumes"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.178009 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f130811c-c622-4ee1-994b-cda735eaaf41" path="/var/lib/kubelet/pods/f130811c-c622-4ee1-994b-cda735eaaf41/volumes"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.388234 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l" podUID="8e77e54b-a880-4131-8873-5aa507b5e286" containerName="controller-manager" containerID="cri-o://9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8" gracePeriod=30
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.388497 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l" event={"ID":"8e77e54b-a880-4131-8873-5aa507b5e286","Type":"ContainerStarted","Data":"9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8"}
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.388522 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l" event={"ID":"8e77e54b-a880-4131-8873-5aa507b5e286","Type":"ContainerStarted","Data":"79925ba5a26e8c0a2198e6377c25488f373680a4b5aa41039032da48cda806d7"}
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.388911 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.391187 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh" event={"ID":"8869a8f3-bf51-4336-a30e-2b9b2a56aab1","Type":"ContainerStarted","Data":"2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830"}
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.391265 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh" event={"ID":"8869a8f3-bf51-4336-a30e-2b9b2a56aab1","Type":"ContainerStarted","Data":"e3bf1e376649d401756eeb13c1167ece536e656cc6e0d5bab000fd504517445d"}
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.391292 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh" podUID="8869a8f3-bf51-4336-a30e-2b9b2a56aab1" containerName="route-controller-manager" containerID="cri-o://2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830" gracePeriod=30
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.391507 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.398357 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.402939 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.415559 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l" podStartSLOduration=3.415538717 podStartE2EDuration="3.415538717s" podCreationTimestamp="2025-10-06 12:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:21:06.412408867 +0000 UTC m=+752.962114632" watchObservedRunningTime="2025-10-06 12:21:06.415538717 +0000 UTC m=+752.965244472"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.454828 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh" podStartSLOduration=3.45480918 podStartE2EDuration="3.45480918s" podCreationTimestamp="2025-10-06 12:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:21:06.450292071 +0000 UTC m=+752.999997836" watchObservedRunningTime="2025-10-06 12:21:06.45480918 +0000 UTC m=+753.004514945"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.899681 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.901289 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.931193 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"]
Oct 06 12:21:06 crc kubenswrapper[4892]: E1006 12:21:06.931446 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8869a8f3-bf51-4336-a30e-2b9b2a56aab1" containerName="route-controller-manager"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.931458 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8869a8f3-bf51-4336-a30e-2b9b2a56aab1" containerName="route-controller-manager"
Oct 06 12:21:06 crc kubenswrapper[4892]: E1006 12:21:06.931466 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e77e54b-a880-4131-8873-5aa507b5e286" containerName="controller-manager"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.931472 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e77e54b-a880-4131-8873-5aa507b5e286" containerName="controller-manager"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.931567 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e77e54b-a880-4131-8873-5aa507b5e286" containerName="controller-manager"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.931581 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8869a8f3-bf51-4336-a30e-2b9b2a56aab1" containerName="route-controller-manager"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.931941 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:06 crc kubenswrapper[4892]: I1006 12:21:06.956697 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"]
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.068764 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-proxy-ca-bundles\") pod \"8e77e54b-a880-4131-8873-5aa507b5e286\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.068812 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcb9t\" (UniqueName: \"kubernetes.io/projected/8e77e54b-a880-4131-8873-5aa507b5e286-kube-api-access-wcb9t\") pod \"8e77e54b-a880-4131-8873-5aa507b5e286\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.068843 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-client-ca\") pod \"8e77e54b-a880-4131-8873-5aa507b5e286\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.068876 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-config\") pod \"8e77e54b-a880-4131-8873-5aa507b5e286\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.068892 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-config\") pod \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.069607 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-config" (OuterVolumeSpecName: "config") pod "8869a8f3-bf51-4336-a30e-2b9b2a56aab1" (UID: "8869a8f3-bf51-4336-a30e-2b9b2a56aab1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.069986 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8e77e54b-a880-4131-8873-5aa507b5e286" (UID: "8e77e54b-a880-4131-8873-5aa507b5e286"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.070451 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e77e54b-a880-4131-8873-5aa507b5e286-serving-cert\") pod \"8e77e54b-a880-4131-8873-5aa507b5e286\" (UID: \"8e77e54b-a880-4131-8873-5aa507b5e286\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.072369 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-config" (OuterVolumeSpecName: "config") pod "8e77e54b-a880-4131-8873-5aa507b5e286" (UID: "8e77e54b-a880-4131-8873-5aa507b5e286"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.073692 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shvwn\" (UniqueName: \"kubernetes.io/projected/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-kube-api-access-shvwn\") pod \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.073878 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-client-ca\") pod \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.073926 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-serving-cert\") pod \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\" (UID: \"8869a8f3-bf51-4336-a30e-2b9b2a56aab1\") "
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.074040 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-client-ca" (OuterVolumeSpecName: "client-ca") pod "8e77e54b-a880-4131-8873-5aa507b5e286" (UID: "8e77e54b-a880-4131-8873-5aa507b5e286"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.074882 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-client-ca" (OuterVolumeSpecName: "client-ca") pod "8869a8f3-bf51-4336-a30e-2b9b2a56aab1" (UID: "8869a8f3-bf51-4336-a30e-2b9b2a56aab1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.075011 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e77e54b-a880-4131-8873-5aa507b5e286-kube-api-access-wcb9t" (OuterVolumeSpecName: "kube-api-access-wcb9t") pod "8e77e54b-a880-4131-8873-5aa507b5e286" (UID: "8e77e54b-a880-4131-8873-5aa507b5e286"). InnerVolumeSpecName "kube-api-access-wcb9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.074963 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzdm\" (UniqueName: \"kubernetes.io/projected/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-kube-api-access-rxzdm\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.075420 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-serving-cert\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.075473 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-client-ca\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.075508 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-config\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.076208 4892 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.076256 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcb9t\" (UniqueName: \"kubernetes.io/projected/8e77e54b-a880-4131-8873-5aa507b5e286-kube-api-access-wcb9t\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.076280 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-client-ca\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.076294 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e77e54b-a880-4131-8873-5aa507b5e286-config\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.076309 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-config\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.076321 4892 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-client-ca\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.080407 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-kube-api-access-shvwn" (OuterVolumeSpecName: "kube-api-access-shvwn") pod "8869a8f3-bf51-4336-a30e-2b9b2a56aab1" (UID: "8869a8f3-bf51-4336-a30e-2b9b2a56aab1"). InnerVolumeSpecName "kube-api-access-shvwn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.083627 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8869a8f3-bf51-4336-a30e-2b9b2a56aab1" (UID: "8869a8f3-bf51-4336-a30e-2b9b2a56aab1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.091033 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e77e54b-a880-4131-8873-5aa507b5e286-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8e77e54b-a880-4131-8873-5aa507b5e286" (UID: "8e77e54b-a880-4131-8873-5aa507b5e286"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.177816 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzdm\" (UniqueName: \"kubernetes.io/projected/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-kube-api-access-rxzdm\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.177891 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-serving-cert\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.177922 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-client-ca\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.177950 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-config\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.179201 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-client-ca\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.179341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-config\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.179374 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e77e54b-a880-4131-8873-5aa507b5e286-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.179396 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shvwn\" (UniqueName: \"kubernetes.io/projected/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-kube-api-access-shvwn\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.179412 4892 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8869a8f3-bf51-4336-a30e-2b9b2a56aab1-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.181613 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-serving-cert\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.193496 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzdm\" (UniqueName: \"kubernetes.io/projected/c7041a7a-3e90-40eb-8a6b-2b584ed891eb-kube-api-access-rxzdm\") pod \"route-controller-manager-66c656748c-8hn5q\" (UID: \"c7041a7a-3e90-40eb-8a6b-2b584ed891eb\") " pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.251931 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.398530 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e77e54b-a880-4131-8873-5aa507b5e286" containerID="9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8" exitCode=0
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.398605 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l" event={"ID":"8e77e54b-a880-4131-8873-5aa507b5e286","Type":"ContainerDied","Data":"9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8"}
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.398634 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l" event={"ID":"8e77e54b-a880-4131-8873-5aa507b5e286","Type":"ContainerDied","Data":"79925ba5a26e8c0a2198e6377c25488f373680a4b5aa41039032da48cda806d7"}
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.398655 4892 scope.go:117] "RemoveContainer" containerID="9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.398763 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.410892 4892 generic.go:334] "Generic (PLEG): container finished" podID="8869a8f3-bf51-4336-a30e-2b9b2a56aab1" containerID="2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830" exitCode=0
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.410922 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.410943 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh" event={"ID":"8869a8f3-bf51-4336-a30e-2b9b2a56aab1","Type":"ContainerDied","Data":"2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830"}
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.410972 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh" event={"ID":"8869a8f3-bf51-4336-a30e-2b9b2a56aab1","Type":"ContainerDied","Data":"e3bf1e376649d401756eeb13c1167ece536e656cc6e0d5bab000fd504517445d"}
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.455804 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"]
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.465836 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bcbd4fd74-sp68l"]
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.471131 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"]
Oct 06 12:21:07 crc kubenswrapper[4892]: I1006 12:21:07.474908 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64d6fd6558-lbcbh"]
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.158353 4892 scope.go:117] "RemoveContainer" containerID="9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8"
Oct 06 12:21:08 crc kubenswrapper[4892]: E1006 12:21:08.158979 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8\": container with ID starting with 9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8 not found: ID does not exist" containerID="9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.159004 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8"} err="failed to get container status \"9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8\": rpc error: code = NotFound desc = could not find container \"9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8\": container with ID starting with 9a34025561042af988f5b3a00b030524c6c37ec54840538b3eba79b2e17abda8 not found: ID does not exist"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.159024 4892 scope.go:117] "RemoveContainer" containerID="2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.177313 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8869a8f3-bf51-4336-a30e-2b9b2a56aab1" path="/var/lib/kubelet/pods/8869a8f3-bf51-4336-a30e-2b9b2a56aab1/volumes"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.177988 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e77e54b-a880-4131-8873-5aa507b5e286" path="/var/lib/kubelet/pods/8e77e54b-a880-4131-8873-5aa507b5e286/volumes"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.181485 4892 scope.go:117] "RemoveContainer" containerID="2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830"
Oct 06 12:21:08 crc kubenswrapper[4892]: E1006 12:21:08.182024 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830\": container with ID starting with 2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830 not found: ID does not exist" containerID="2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.182052 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830"} err="failed to get container status \"2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830\": rpc error: code = NotFound desc = could not find container \"2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830\": container with ID starting with 2d5f8eef386fe9dcf3fda493c787ae4cb00b00d6f46ae0906324484e70809830 not found: ID does not exist"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.360510 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"]
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.418482 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" event={"ID":"0e90157b-0ee2-45ab-b457-e4dd396bfcf4","Type":"ContainerStarted","Data":"719cb815ce0a78aa8e6296013b15bf3da9d45698324cd5bdd6f15795a7eabe1c"}
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.419641 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96"
Oct 06 12:21:08 crc kubenswrapper[4892]: I1006 12:21:08.440455 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" podStartSLOduration=1.566433994 podStartE2EDuration="5.440439118s" podCreationTimestamp="2025-10-06 12:21:03 +0000 UTC" firstStartedPulling="2025-10-06 12:21:04.361125029 +0000 UTC m=+750.910830794" lastFinishedPulling="2025-10-06 12:21:08.235130163 +0000 UTC m=+754.784835918" observedRunningTime="2025-10-06 12:21:08.436821565 +0000 UTC m=+754.986527330" watchObservedRunningTime="2025-10-06 12:21:08.440439118 +0000 UTC m=+754.990144883"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.747636 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"]
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.749130 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.754091 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.754283 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.756114 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.756347 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.756378 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.758342 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.762122 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"]
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.762160 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.918607 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-client-ca\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.918681 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-config\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.918709 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-serving-cert\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.918759 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt8mb\" (UniqueName: \"kubernetes.io/projected/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-kube-api-access-jt8mb\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:09 crc kubenswrapper[4892]: I1006 12:21:09.918823 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-proxy-ca-bundles\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.020445 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-proxy-ca-bundles\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.020552 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-client-ca\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.020608 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-config\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.020652 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-serving-cert\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.020683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt8mb\" (UniqueName: \"kubernetes.io/projected/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-kube-api-access-jt8mb\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.021479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-client-ca\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.021887 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-config\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.022161 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-proxy-ca-bundles\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.029111 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-serving-cert\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.043026 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt8mb\" (UniqueName: \"kubernetes.io/projected/7341b409-ec4c-4e44-95bd-a7d4f3d90bb3-kube-api-access-jt8mb\") pod \"controller-manager-7d458d4b57-wr9hg\" (UID: \"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3\") " pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.084099 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.445081 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" event={"ID":"3c11a8ae-6eac-4709-9c88-aa7048e7bb08","Type":"ContainerStarted","Data":"a710e6f327f5c1aa2c6305c43751930844a7571ba56c5668de7d49bfea96cd15"}
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.445803 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.446960 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q" event={"ID":"c7041a7a-3e90-40eb-8a6b-2b584ed891eb","Type":"ContainerStarted","Data":"12464eba92a41f28b7fc441d28165738476045805d8812d85a4c10b5432ffdf5"}
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.447023 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q" event={"ID":"c7041a7a-3e90-40eb-8a6b-2b584ed891eb","Type":"ContainerStarted","Data":"642784623b812f264c20e5b592298af389d9f059cbf3b0c8f8e28e9da7031bd9"}
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.476045 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" podStartSLOduration=2.090858141 podStartE2EDuration="7.476025877s" podCreationTimestamp="2025-10-06 12:21:03 +0000 UTC" firstStartedPulling="2025-10-06 12:21:04.43418389 +0000 UTC m=+750.983889655" lastFinishedPulling="2025-10-06 12:21:09.819351626 +0000 UTC m=+756.369057391" observedRunningTime="2025-10-06 12:21:10.473270148 +0000 UTC m=+757.022975913" watchObservedRunningTime="2025-10-06 12:21:10.476025877 +0000 UTC m=+757.025731642"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.508222 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q" podStartSLOduration=5.508203427 podStartE2EDuration="5.508203427s" podCreationTimestamp="2025-10-06 12:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:21:10.505074188 +0000 UTC m=+757.054779953" watchObservedRunningTime="2025-10-06 12:21:10.508203427 +0000 UTC m=+757.057909192"
Oct 06 12:21:10 crc kubenswrapper[4892]: I1006 12:21:10.514357 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"]
Oct 06 12:21:11 crc kubenswrapper[4892]: I1006 12:21:11.456128 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg" event={"ID":"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3","Type":"ContainerStarted","Data":"0c943f5409ba95aa32ee5d0b93c43084de17cffe87a336817a94a882513f76a4"}
Oct 06 12:21:11 crc kubenswrapper[4892]: I1006 12:21:11.458199 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:11 crc kubenswrapper[4892]: I1006 12:21:11.458445 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg" event={"ID":"7341b409-ec4c-4e44-95bd-a7d4f3d90bb3","Type":"ContainerStarted","Data":"333b1374e96dec83d5dc43e9117e792f8930c3f1c378d3a90196de4f39400817"}
Oct 06 12:21:11 crc kubenswrapper[4892]: I1006 12:21:11.468956 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66c656748c-8hn5q"
Oct 06 12:21:11 crc kubenswrapper[4892]: I1006 12:21:11.478496 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg" podStartSLOduration=6.478468661 podStartE2EDuration="6.478468661s" podCreationTimestamp="2025-10-06 12:21:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:21:11.477619916 +0000 UTC m=+758.027325721" watchObservedRunningTime="2025-10-06 12:21:11.478468661 +0000 UTC m=+758.028174466"
Oct 06 12:21:11 crc kubenswrapper[4892]: I1006 12:21:11.759406 4892 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 06 12:21:12 crc kubenswrapper[4892]: I1006 12:21:12.462270 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:12 crc kubenswrapper[4892]: I1006 12:21:12.468091 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d458d4b57-wr9hg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.760808 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7v5cg"]
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.762928 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.773903 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7v5cg"]
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.877752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-utilities\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.877871 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-catalog-content\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.878122 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jczzd\" (UniqueName: \"kubernetes.io/projected/8e7d3a73-8af6-415c-8f3d-3d42934aef99-kube-api-access-jczzd\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.979146 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jczzd\" (UniqueName: \"kubernetes.io/projected/8e7d3a73-8af6-415c-8f3d-3d42934aef99-kube-api-access-jczzd\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.979240 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-utilities\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.979261 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-catalog-content\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.979736 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-catalog-content\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.979938 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-utilities\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg"
Oct 06 12:21:21 crc kubenswrapper[4892]: I1006 12:21:21.996629 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"kube-api-access-jczzd\" (UniqueName: \"kubernetes.io/projected/8e7d3a73-8af6-415c-8f3d-3d42934aef99-kube-api-access-jczzd\") pod \"redhat-operators-7v5cg\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:22 crc kubenswrapper[4892]: I1006 12:21:22.094163 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:22 crc kubenswrapper[4892]: I1006 12:21:22.575723 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7v5cg"] Oct 06 12:21:23 crc kubenswrapper[4892]: I1006 12:21:23.532811 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerID="bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2" exitCode=0 Oct 06 12:21:23 crc kubenswrapper[4892]: I1006 12:21:23.532947 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7v5cg" event={"ID":"8e7d3a73-8af6-415c-8f3d-3d42934aef99","Type":"ContainerDied","Data":"bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2"} Oct 06 12:21:23 crc kubenswrapper[4892]: I1006 12:21:23.533116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7v5cg" event={"ID":"8e7d3a73-8af6-415c-8f3d-3d42934aef99","Type":"ContainerStarted","Data":"4044fabbbc50b0829203c146583f66c22302a207208f43fcd347918ec2d1f7d5"} Oct 06 12:21:24 crc kubenswrapper[4892]: I1006 12:21:24.224634 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-cf886c89f-mc4ms" Oct 06 12:21:25 crc kubenswrapper[4892]: I1006 12:21:25.545918 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerID="59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014" exitCode=0 Oct 06 12:21:25 crc kubenswrapper[4892]: I1006 12:21:25.545999 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7v5cg" event={"ID":"8e7d3a73-8af6-415c-8f3d-3d42934aef99","Type":"ContainerDied","Data":"59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014"} Oct 06 12:21:26 crc kubenswrapper[4892]: I1006 12:21:26.554758 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7v5cg" event={"ID":"8e7d3a73-8af6-415c-8f3d-3d42934aef99","Type":"ContainerStarted","Data":"b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc"} Oct 06 12:21:26 crc kubenswrapper[4892]: I1006 12:21:26.578426 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7v5cg" podStartSLOduration=3.09567824 podStartE2EDuration="5.578404642s" podCreationTimestamp="2025-10-06 12:21:21 +0000 UTC" firstStartedPulling="2025-10-06 12:21:23.535316085 +0000 UTC m=+770.085021890" lastFinishedPulling="2025-10-06 12:21:26.018042487 +0000 UTC m=+772.567748292" observedRunningTime="2025-10-06 12:21:26.578401942 +0000 UTC m=+773.128107737" watchObservedRunningTime="2025-10-06 12:21:26.578404642 +0000 UTC m=+773.128110427" Oct 06 12:21:32 crc kubenswrapper[4892]: I1006 12:21:32.094853 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:32 crc kubenswrapper[4892]: I1006 12:21:32.094947 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:32 crc kubenswrapper[4892]: I1006 12:21:32.226905 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:32 crc kubenswrapper[4892]: I1006 12:21:32.658454 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:34 crc kubenswrapper[4892]: I1006 12:21:34.556647 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7v5cg"] Oct 06 12:21:34 crc kubenswrapper[4892]: I1006 12:21:34.616505 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7v5cg" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="registry-server" containerID="cri-o://b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc" gracePeriod=2 Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.211077 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.377763 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-utilities\") pod \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.377890 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jczzd\" (UniqueName: \"kubernetes.io/projected/8e7d3a73-8af6-415c-8f3d-3d42934aef99-kube-api-access-jczzd\") pod \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.378029 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-catalog-content\") pod \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\" (UID: \"8e7d3a73-8af6-415c-8f3d-3d42934aef99\") " Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.379540 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-utilities" (OuterVolumeSpecName: "utilities") pod "8e7d3a73-8af6-415c-8f3d-3d42934aef99" (UID: "8e7d3a73-8af6-415c-8f3d-3d42934aef99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.385696 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7d3a73-8af6-415c-8f3d-3d42934aef99-kube-api-access-jczzd" (OuterVolumeSpecName: "kube-api-access-jczzd") pod "8e7d3a73-8af6-415c-8f3d-3d42934aef99" (UID: "8e7d3a73-8af6-415c-8f3d-3d42934aef99"). InnerVolumeSpecName "kube-api-access-jczzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.479775 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.479825 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jczzd\" (UniqueName: \"kubernetes.io/projected/8e7d3a73-8af6-415c-8f3d-3d42934aef99-kube-api-access-jczzd\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.625310 4892 generic.go:334] "Generic (PLEG): container finished" podID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerID="b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc" exitCode=0 Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.625382 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7v5cg" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.625364 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7v5cg" event={"ID":"8e7d3a73-8af6-415c-8f3d-3d42934aef99","Type":"ContainerDied","Data":"b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc"} Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.625854 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7v5cg" event={"ID":"8e7d3a73-8af6-415c-8f3d-3d42934aef99","Type":"ContainerDied","Data":"4044fabbbc50b0829203c146583f66c22302a207208f43fcd347918ec2d1f7d5"} Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.625877 4892 scope.go:117] "RemoveContainer" containerID="b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.645864 4892 scope.go:117] "RemoveContainer" containerID="59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.662744 4892 scope.go:117] "RemoveContainer" containerID="bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.681475 4892 scope.go:117] "RemoveContainer" containerID="b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc" Oct 06 12:21:35 crc kubenswrapper[4892]: E1006 12:21:35.681837 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc\": container with ID starting with b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc not found: ID does not exist" containerID="b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.681903 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc"} err="failed to get container status \"b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc\": rpc error: code = NotFound desc = could not find container \"b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc\": container with ID starting with b98ed321e0ccb2936c5847ad9dee611a98866d8697101f95f6673fb462216fbc not found: ID does not exist" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.681953 4892 scope.go:117] 
"RemoveContainer" containerID="59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014" Oct 06 12:21:35 crc kubenswrapper[4892]: E1006 12:21:35.682481 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014\": container with ID starting with 59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014 not found: ID does not exist" containerID="59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.682521 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014"} err="failed to get container status \"59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014\": rpc error: code = NotFound desc = could not find container \"59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014\": container with ID starting with 59908a39007dd0bbf5d592fe8ef3651cbe917329b69ac812d29733d4b585e014 not found: ID does not exist" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.682546 4892 scope.go:117] "RemoveContainer" containerID="bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2" Oct 06 12:21:35 crc kubenswrapper[4892]: E1006 12:21:35.682808 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2\": container with ID starting with bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2 not found: ID does not exist" containerID="bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2" Oct 06 12:21:35 crc kubenswrapper[4892]: I1006 12:21:35.682831 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2"} err="failed to get container status \"bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2\": rpc error: code = NotFound desc = could not find container \"bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2\": container with ID starting with bc3845e150a70aef3c7e9fdee0a8e72df51641b24e61cb3841256e5c395db2a2 not found: ID does not exist" Oct 06 12:21:36 crc kubenswrapper[4892]: I1006 12:21:36.706729 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e7d3a73-8af6-415c-8f3d-3d42934aef99" (UID: "8e7d3a73-8af6-415c-8f3d-3d42934aef99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:21:36 crc kubenswrapper[4892]: I1006 12:21:36.797974 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e7d3a73-8af6-415c-8f3d-3d42934aef99-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:21:36 crc kubenswrapper[4892]: I1006 12:21:36.870490 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7v5cg"] Oct 06 12:21:36 crc kubenswrapper[4892]: I1006 12:21:36.877084 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7v5cg"] Oct 06 12:21:38 crc kubenswrapper[4892]: I1006 12:21:38.179599 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" path="/var/lib/kubelet/pods/8e7d3a73-8af6-415c-8f3d-3d42934aef99/volumes" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.770286 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dt44c"] Oct 06 12:21:39 crc kubenswrapper[4892]: E1006 12:21:39.770692 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="registry-server" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.770713 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="registry-server" Oct 06 12:21:39 crc kubenswrapper[4892]: E1006 12:21:39.770741 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="extract-content" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.770754 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="extract-content" Oct 06 12:21:39 crc kubenswrapper[4892]: E1006 12:21:39.770777 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="extract-utilities" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.770789 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="extract-utilities" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.774193 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e7d3a73-8af6-415c-8f3d-3d42934aef99" containerName="registry-server" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.775722 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.787406 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt44c"] Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.943165 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd8rm\" (UniqueName: \"kubernetes.io/projected/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-kube-api-access-cd8rm\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.943484 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-catalog-content\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:39 crc kubenswrapper[4892]: I1006 12:21:39.943574 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-utilities\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.044782 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd8rm\" (UniqueName: \"kubernetes.io/projected/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-kube-api-access-cd8rm\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.045014 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-catalog-content\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.045108 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-utilities\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.045657 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-utilities\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.045734 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-catalog-content\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.080776 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cd8rm\" (UniqueName: \"kubernetes.io/projected/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-kube-api-access-cd8rm\") pod \"redhat-marketplace-dt44c\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") " pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.107597 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt44c" Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.542553 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt44c"] Oct 06 12:21:40 crc kubenswrapper[4892]: I1006 12:21:40.672500 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt44c" event={"ID":"f2522d11-54ae-40b7-83d7-ac88b19a9bb9","Type":"ContainerStarted","Data":"324c72b837213e142706d81f5b76bdcbc76df46b25f55dc05139879a8ae4326b"} Oct 06 12:21:41 crc kubenswrapper[4892]: I1006 12:21:41.682119 4892 generic.go:334] "Generic (PLEG): container finished" podID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerID="ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb" exitCode=0 Oct 06 12:21:41 crc kubenswrapper[4892]: I1006 12:21:41.682229 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt44c" event={"ID":"f2522d11-54ae-40b7-83d7-ac88b19a9bb9","Type":"ContainerDied","Data":"ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb"} Oct 06 12:21:42 crc kubenswrapper[4892]: I1006 12:21:42.693283 4892 generic.go:334] "Generic (PLEG): container finished" podID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerID="0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d" exitCode=0 Oct 06 12:21:42 crc kubenswrapper[4892]: I1006 12:21:42.693374 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt44c" event={"ID":"f2522d11-54ae-40b7-83d7-ac88b19a9bb9","Type":"ContainerDied","Data":"0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d"} Oct 06 12:21:43 crc kubenswrapper[4892]: I1006 12:21:43.706279 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt44c" event={"ID":"f2522d11-54ae-40b7-83d7-ac88b19a9bb9","Type":"ContainerStarted","Data":"2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c"} Oct 06 12:21:43 crc kubenswrapper[4892]: I1006 12:21:43.738535 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dt44c" podStartSLOduration=3.327694284 podStartE2EDuration="4.738513355s" podCreationTimestamp="2025-10-06 12:21:39 +0000 UTC" firstStartedPulling="2025-10-06 12:21:41.685270662 +0000 UTC m=+788.234976437" lastFinishedPulling="2025-10-06 12:21:43.096089753 +0000 UTC m=+789.645795508" observedRunningTime="2025-10-06 12:21:43.738237197 +0000 UTC m=+790.287943002" watchObservedRunningTime="2025-10-06 12:21:43.738513355 +0000 UTC m=+790.288219130" Oct 06 12:21:43 crc kubenswrapper[4892]: I1006 12:21:43.761692 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5dc65fffc5-fws96" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.496107 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-zqfvv"] Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.500213 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.503925 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kzsmg" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.504148 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.504169 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.511064 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5"] Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.511764 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.514552 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.549487 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5"] Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551028 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/344945cb-67e7-4600-a300-676dcddc3659-metrics-certs\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551082 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-frr-sockets\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551125 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw9vh\" (UniqueName: \"kubernetes.io/projected/9f2d191f-6b44-4c22-bddd-53bd6237ba29-kube-api-access-rw9vh\") pod \"frr-k8s-webhook-server-64bf5d555-pcrl5\" (UID: \"9f2d191f-6b44-4c22-bddd-53bd6237ba29\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551160 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f2d191f-6b44-4c22-bddd-53bd6237ba29-cert\") pod \"frr-k8s-webhook-server-64bf5d555-pcrl5\" (UID: \"9f2d191f-6b44-4c22-bddd-53bd6237ba29\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551205 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-reloader\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551221 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/344945cb-67e7-4600-a300-676dcddc3659-frr-startup\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551239 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-metrics\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-frr-conf\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.551290 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45r86\" (UniqueName: \"kubernetes.io/projected/344945cb-67e7-4600-a300-676dcddc3659-kube-api-access-45r86\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.583597 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-ffpm9"] Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.584570 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.587684 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-xnc9c"] Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.589650 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.589848 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.590066 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.590487 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-xsl8d" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.591034 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.592559 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.610644 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xnc9c"] Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652239 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-reloader\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652280 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-cert\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652303 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/344945cb-67e7-4600-a300-676dcddc3659-frr-startup\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652343 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-metrics\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652367 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652394 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metrics-certs\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652434 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-frr-conf\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652460 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metallb-excludel2\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652482 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45r86\" (UniqueName: 
\"kubernetes.io/projected/344945cb-67e7-4600-a300-676dcddc3659-kube-api-access-45r86\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652509 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd69h\" (UniqueName: \"kubernetes.io/projected/619837d2-8e2d-42d7-a34d-a3c1e39d213b-kube-api-access-wd69h\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652540 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-metrics-certs\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652565 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/344945cb-67e7-4600-a300-676dcddc3659-metrics-certs\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652586 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ktj9\" (UniqueName: \"kubernetes.io/projected/8309be58-c242-4799-a7db-ebb0171b23de-kube-api-access-6ktj9\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652611 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-frr-sockets\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652640 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw9vh\" (UniqueName: \"kubernetes.io/projected/9f2d191f-6b44-4c22-bddd-53bd6237ba29-kube-api-access-rw9vh\") pod \"frr-k8s-webhook-server-64bf5d555-pcrl5\" (UID: \"9f2d191f-6b44-4c22-bddd-53bd6237ba29\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652671 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f2d191f-6b44-4c22-bddd-53bd6237ba29-cert\") pod \"frr-k8s-webhook-server-64bf5d555-pcrl5\" (UID: \"9f2d191f-6b44-4c22-bddd-53bd6237ba29\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652690 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-reloader\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652706 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-metrics\") pod 
\"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.652852 4892 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.652888 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-frr-conf\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.652900 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/344945cb-67e7-4600-a300-676dcddc3659-metrics-certs podName:344945cb-67e7-4600-a300-676dcddc3659 nodeName:}" failed. No retries permitted until 2025-10-06 12:21:45.1528829 +0000 UTC m=+791.702588675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/344945cb-67e7-4600-a300-676dcddc3659-metrics-certs") pod "frr-k8s-zqfvv" (UID: "344945cb-67e7-4600-a300-676dcddc3659") : secret "frr-k8s-certs-secret" not found Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.653150 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/344945cb-67e7-4600-a300-676dcddc3659-frr-sockets\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.653433 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/344945cb-67e7-4600-a300-676dcddc3659-frr-startup\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.670283 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9f2d191f-6b44-4c22-bddd-53bd6237ba29-cert\") pod \"frr-k8s-webhook-server-64bf5d555-pcrl5\" (UID: \"9f2d191f-6b44-4c22-bddd-53bd6237ba29\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.672378 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45r86\" (UniqueName: \"kubernetes.io/projected/344945cb-67e7-4600-a300-676dcddc3659-kube-api-access-45r86\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.673062 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw9vh\" (UniqueName: \"kubernetes.io/projected/9f2d191f-6b44-4c22-bddd-53bd6237ba29-kube-api-access-rw9vh\") pod \"frr-k8s-webhook-server-64bf5d555-pcrl5\" (UID: \"9f2d191f-6b44-4c22-bddd-53bd6237ba29\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.767940 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd69h\" (UniqueName: \"kubernetes.io/projected/619837d2-8e2d-42d7-a34d-a3c1e39d213b-kube-api-access-wd69h\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 
12:21:44.768787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-metrics-certs\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.768911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ktj9\" (UniqueName: \"kubernetes.io/projected/8309be58-c242-4799-a7db-ebb0171b23de-kube-api-access-6ktj9\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.769252 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-cert\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.769361 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.769440 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metrics-certs\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.769541 4892 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.769612 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metrics-certs podName:619837d2-8e2d-42d7-a34d-a3c1e39d213b nodeName:}" failed. No retries permitted until 2025-10-06 12:21:45.26959237 +0000 UTC m=+791.819298135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metrics-certs") pod "speaker-ffpm9" (UID: "619837d2-8e2d-42d7-a34d-a3c1e39d213b") : secret "speaker-certs-secret" not found Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.769619 4892 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.769655 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metallb-excludel2\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.769698 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-metrics-certs podName:8309be58-c242-4799-a7db-ebb0171b23de nodeName:}" failed. 
No retries permitted until 2025-10-06 12:21:45.269671022 +0000 UTC m=+791.819376787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-metrics-certs") pod "controller-68d546b9d8-xnc9c" (UID: "8309be58-c242-4799-a7db-ebb0171b23de") : secret "controller-certs-secret" not found Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.769742 4892 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 12:21:44 crc kubenswrapper[4892]: E1006 12:21:44.769778 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist podName:619837d2-8e2d-42d7-a34d-a3c1e39d213b nodeName:}" failed. No retries permitted until 2025-10-06 12:21:45.269768765 +0000 UTC m=+791.819474530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist") pod "speaker-ffpm9" (UID: "619837d2-8e2d-42d7-a34d-a3c1e39d213b") : secret "metallb-memberlist" not found Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.779147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metallb-excludel2\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.779627 4892 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.787876 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-cert\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.792092 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ktj9\" (UniqueName: \"kubernetes.io/projected/8309be58-c242-4799-a7db-ebb0171b23de-kube-api-access-6ktj9\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.799196 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd69h\" (UniqueName: \"kubernetes.io/projected/619837d2-8e2d-42d7-a34d-a3c1e39d213b-kube-api-access-wd69h\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:44 crc kubenswrapper[4892]: I1006 12:21:44.825071 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.175510 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/344945cb-67e7-4600-a300-676dcddc3659-metrics-certs\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.182057 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/344945cb-67e7-4600-a300-676dcddc3659-metrics-certs\") pod \"frr-k8s-zqfvv\" (UID: \"344945cb-67e7-4600-a300-676dcddc3659\") " pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.240765 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5"] Oct 06 12:21:45 crc kubenswrapper[4892]: W1006 12:21:45.245496 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f2d191f_6b44_4c22_bddd_53bd6237ba29.slice/crio-25e38fe293af64890633178c34f6726366fedb26d83d4d507cca9f153ca0be6b WatchSource:0}: Error finding container 25e38fe293af64890633178c34f6726366fedb26d83d4d507cca9f153ca0be6b: Status 404 returned error can't find the container with id 25e38fe293af64890633178c34f6726366fedb26d83d4d507cca9f153ca0be6b Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.277069 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-metrics-certs\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.277290 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.277440 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metrics-certs\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:45 crc kubenswrapper[4892]: E1006 12:21:45.277544 4892 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 06 12:21:45 crc kubenswrapper[4892]: E1006 12:21:45.277654 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist podName:619837d2-8e2d-42d7-a34d-a3c1e39d213b nodeName:}" failed. No retries permitted until 2025-10-06 12:21:46.277628047 +0000 UTC m=+792.827333842 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist") pod "speaker-ffpm9" (UID: "619837d2-8e2d-42d7-a34d-a3c1e39d213b") : secret "metallb-memberlist" not found Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.282583 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8309be58-c242-4799-a7db-ebb0171b23de-metrics-certs\") pod \"controller-68d546b9d8-xnc9c\" (UID: \"8309be58-c242-4799-a7db-ebb0171b23de\") " pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.286520 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-metrics-certs\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.413518 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.529391 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-xnc9c" Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.721799 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" event={"ID":"9f2d191f-6b44-4c22-bddd-53bd6237ba29","Type":"ContainerStarted","Data":"25e38fe293af64890633178c34f6726366fedb26d83d4d507cca9f153ca0be6b"} Oct 06 12:21:45 crc kubenswrapper[4892]: I1006 12:21:45.722822 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerStarted","Data":"71a2a50a01f0b389150d65ba0f921e012bc719edcc1547ac732b1226cd4222d6"} Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.008449 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-xnc9c"] Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.292278 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.300400 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/619837d2-8e2d-42d7-a34d-a3c1e39d213b-memberlist\") pod \"speaker-ffpm9\" (UID: \"619837d2-8e2d-42d7-a34d-a3c1e39d213b\") " pod="metallb-system/speaker-ffpm9" Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.417186 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:21:46 crc kubenswrapper[4892]: W1006 12:21:46.457411 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod619837d2_8e2d_42d7_a34d_a3c1e39d213b.slice/crio-a122787eba49f97805e3e9cc0ee43a6b9c1708691334149c10512f5230f37804 WatchSource:0}: Error finding container a122787eba49f97805e3e9cc0ee43a6b9c1708691334149c10512f5230f37804: Status 404 returned error can't find the container with id a122787eba49f97805e3e9cc0ee43a6b9c1708691334149c10512f5230f37804
Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.731153 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xnc9c" event={"ID":"8309be58-c242-4799-a7db-ebb0171b23de","Type":"ContainerStarted","Data":"69100173baf08a4ab9ed94290756cd747b3cc54247cb408dff8877a4220b0ebd"}
Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.731499 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xnc9c" event={"ID":"8309be58-c242-4799-a7db-ebb0171b23de","Type":"ContainerStarted","Data":"d77e05118263a4a3ec894ff4425f85c4f0a48db3190d185c5288ab4f60f38f05"}
Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.731513 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-xnc9c" event={"ID":"8309be58-c242-4799-a7db-ebb0171b23de","Type":"ContainerStarted","Data":"68465abc85fd1a1ef495726c076e3d86a0b6d412597ddc11cae5129c193ef026"}
Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.731554 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-xnc9c"
Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.733807 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ffpm9" event={"ID":"619837d2-8e2d-42d7-a34d-a3c1e39d213b","Type":"ContainerStarted","Data":"a122787eba49f97805e3e9cc0ee43a6b9c1708691334149c10512f5230f37804"}
Oct 06 12:21:46 crc kubenswrapper[4892]: I1006 12:21:46.751175 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-xnc9c" podStartSLOduration=2.751149132 podStartE2EDuration="2.751149132s" podCreationTimestamp="2025-10-06 12:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:21:46.749800433 +0000 UTC m=+793.299506268" watchObservedRunningTime="2025-10-06 12:21:46.751149132 +0000 UTC m=+793.300854927"
Oct 06 12:21:47 crc kubenswrapper[4892]: I1006 12:21:47.741488 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ffpm9" event={"ID":"619837d2-8e2d-42d7-a34d-a3c1e39d213b","Type":"ContainerStarted","Data":"3a22eaf26894df448a3503f5ab03c29264836c05d048c15551a9f05d5be2a647"}
Oct 06 12:21:47 crc kubenswrapper[4892]: I1006 12:21:47.741557 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-ffpm9" event={"ID":"619837d2-8e2d-42d7-a34d-a3c1e39d213b","Type":"ContainerStarted","Data":"50e3a0390b21cb643528abb337c3d463f5baeeb50bb3f184df23cd860852787c"}
Oct 06 12:21:47 crc kubenswrapper[4892]: I1006 12:21:47.741580 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-ffpm9"
Oct 06 12:21:47 crc kubenswrapper[4892]: I1006 12:21:47.758634 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-ffpm9" podStartSLOduration=3.75861836 podStartE2EDuration="3.75861836s" podCreationTimestamp="2025-10-06 12:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:21:47.757973591 +0000 UTC m=+794.307679356" watchObservedRunningTime="2025-10-06 12:21:47.75861836 +0000 UTC m=+794.308324125"
Oct 06 12:21:50 crc kubenswrapper[4892]: I1006 12:21:50.108834 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dt44c"
Oct 06 12:21:50 crc kubenswrapper[4892]: I1006 12:21:50.109643 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dt44c"
Oct 06 12:21:50 crc kubenswrapper[4892]: I1006 12:21:50.166069 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dt44c"
Oct 06 12:21:50 crc kubenswrapper[4892]: I1006 12:21:50.821901 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dt44c"
Oct 06 12:21:51 crc kubenswrapper[4892]: I1006 12:21:51.550767 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt44c"]
Oct 06 12:21:52 crc kubenswrapper[4892]: I1006 12:21:52.784684 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dt44c" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="registry-server" containerID="cri-o://2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c" gracePeriod=2
Oct 06 12:21:52 crc kubenswrapper[4892]: I1006 12:21:52.984452 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 12:21:52 crc kubenswrapper[4892]: I1006 12:21:52.984874 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.248613 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt44c"
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.391064 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-catalog-content\") pod \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") "
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.391188 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-utilities\") pod \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") "
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.391228 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cd8rm\" (UniqueName: \"kubernetes.io/projected/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-kube-api-access-cd8rm\") pod \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\" (UID: \"f2522d11-54ae-40b7-83d7-ac88b19a9bb9\") "
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.392565 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-utilities" (OuterVolumeSpecName: "utilities") pod "f2522d11-54ae-40b7-83d7-ac88b19a9bb9" (UID: "f2522d11-54ae-40b7-83d7-ac88b19a9bb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.397796 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-kube-api-access-cd8rm" (OuterVolumeSpecName: "kube-api-access-cd8rm") pod "f2522d11-54ae-40b7-83d7-ac88b19a9bb9" (UID: "f2522d11-54ae-40b7-83d7-ac88b19a9bb9"). InnerVolumeSpecName "kube-api-access-cd8rm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.406334 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2522d11-54ae-40b7-83d7-ac88b19a9bb9" (UID: "f2522d11-54ae-40b7-83d7-ac88b19a9bb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.493003 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.493058 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cd8rm\" (UniqueName: \"kubernetes.io/projected/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-kube-api-access-cd8rm\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.493082 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2522d11-54ae-40b7-83d7-ac88b19a9bb9-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.795516 4892 generic.go:334] "Generic (PLEG): container finished" podID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerID="2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c" exitCode=0
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.795578 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt44c"
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.795599 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt44c" event={"ID":"f2522d11-54ae-40b7-83d7-ac88b19a9bb9","Type":"ContainerDied","Data":"2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c"}
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.796080 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt44c" event={"ID":"f2522d11-54ae-40b7-83d7-ac88b19a9bb9","Type":"ContainerDied","Data":"324c72b837213e142706d81f5b76bdcbc76df46b25f55dc05139879a8ae4326b"}
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.796129 4892 scope.go:117] "RemoveContainer" containerID="2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c"
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.798630 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" event={"ID":"9f2d191f-6b44-4c22-bddd-53bd6237ba29","Type":"ContainerStarted","Data":"2882607268f64e6c4869fe57bc7a61b27c71da1c03d43c926cb89f59424b446d"}
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.799033 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5"
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.805222 4892 generic.go:334] "Generic (PLEG): container finished" podID="344945cb-67e7-4600-a300-676dcddc3659" containerID="331253efcb0b645b871ebbcb91d66644caaa4e9bf931702e0ae2e92d61b1b71e" exitCode=0
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.805296 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerDied","Data":"331253efcb0b645b871ebbcb91d66644caaa4e9bf931702e0ae2e92d61b1b71e"}
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.827914 4892 scope.go:117] "RemoveContainer" containerID="0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d"
Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.830213 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" podStartSLOduration=2.217035078 podStartE2EDuration="9.830196507s" podCreationTimestamp="2025-10-06 12:21:44 +0000 UTC" firstStartedPulling="2025-10-06 12:21:45.24801607 +0000 UTC m=+791.797721835" lastFinishedPulling="2025-10-06 12:21:52.861177489 +0000 UTC m=+799.410883264" observedRunningTime="2025-10-06 12:21:53.828214591 +0000 UTC m=+800.377920376" watchObservedRunningTime="2025-10-06 12:21:53.830196507 +0000 UTC m=+800.379902272"
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5" podStartSLOduration=2.217035078 podStartE2EDuration="9.830196507s" podCreationTimestamp="2025-10-06 12:21:44 +0000 UTC" firstStartedPulling="2025-10-06 12:21:45.24801607 +0000 UTC m=+791.797721835" lastFinishedPulling="2025-10-06 12:21:52.861177489 +0000 UTC m=+799.410883264" observedRunningTime="2025-10-06 12:21:53.828214591 +0000 UTC m=+800.377920376" watchObservedRunningTime="2025-10-06 12:21:53.830196507 +0000 UTC m=+800.379902272" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.865863 4892 scope.go:117] "RemoveContainer" containerID="ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.901708 4892 scope.go:117] "RemoveContainer" containerID="2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.903078 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt44c"] Oct 06 12:21:53 crc kubenswrapper[4892]: E1006 12:21:53.906793 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c\": container with ID starting with 2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c not found: ID does not exist" containerID="2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.906835 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c"} err="failed to get container status \"2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c\": rpc error: code = NotFound desc = could not find container \"2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c\": container with ID starting with 2632d810caa384f2a2ddd37bc45f7d4d60c7fbcff1026aae78f0deec1c1d182c not found: ID does not exist" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.906863 4892 scope.go:117] "RemoveContainer" containerID="0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d" Oct 06 12:21:53 crc kubenswrapper[4892]: E1006 12:21:53.907163 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d\": container with ID starting with 0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d not found: ID does not exist" containerID="0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.907209 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d"} err="failed to get container status \"0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d\": rpc error: code = NotFound desc = could not find container \"0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d\": container with ID starting with 0aed751ff76dd45b5c29d372c03a6d56ff52bcd3edb304b48753f30a34e3384d not found: ID does not exist" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.907240 4892 scope.go:117] "RemoveContainer" containerID="ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb" Oct 06 12:21:53 crc kubenswrapper[4892]: E1006 
12:21:53.907747 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb\": container with ID starting with ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb not found: ID does not exist" containerID="ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.907775 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb"} err="failed to get container status \"ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb\": rpc error: code = NotFound desc = could not find container \"ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb\": container with ID starting with ef77c51a51f78b1daa6939012fa39f8526e070b03955ea2addebb4b700490bcb not found: ID does not exist" Oct 06 12:21:53 crc kubenswrapper[4892]: I1006 12:21:53.908670 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt44c"] Oct 06 12:21:54 crc kubenswrapper[4892]: I1006 12:21:54.180832 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" path="/var/lib/kubelet/pods/f2522d11-54ae-40b7-83d7-ac88b19a9bb9/volumes" Oct 06 12:21:54 crc kubenswrapper[4892]: I1006 12:21:54.819263 4892 generic.go:334] "Generic (PLEG): container finished" podID="344945cb-67e7-4600-a300-676dcddc3659" containerID="df69987863435635149b5322e5129c5aebb0dc9747c263afb8308bebbf83e66a" exitCode=0 Oct 06 12:21:54 crc kubenswrapper[4892]: I1006 12:21:54.819371 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerDied","Data":"df69987863435635149b5322e5129c5aebb0dc9747c263afb8308bebbf83e66a"} Oct 06 12:21:55 crc kubenswrapper[4892]: I1006 12:21:55.838987 4892 generic.go:334] "Generic (PLEG): container finished" podID="344945cb-67e7-4600-a300-676dcddc3659" containerID="05992f2081b2fa6aee1b09fed1f774938d273670643d2900f4093cb17420d8c8" exitCode=0 Oct 06 12:21:55 crc kubenswrapper[4892]: I1006 12:21:55.839050 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerDied","Data":"05992f2081b2fa6aee1b09fed1f774938d273670643d2900f4093cb17420d8c8"} Oct 06 12:21:56 crc kubenswrapper[4892]: I1006 12:21:56.421238 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-ffpm9" Oct 06 12:21:56 crc kubenswrapper[4892]: I1006 12:21:56.855301 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerStarted","Data":"dabb98825c8e66ca4a6f5294f56006adb0b8f261106b290df6f722123fa51729"} Oct 06 12:21:56 crc kubenswrapper[4892]: I1006 12:21:56.856507 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerStarted","Data":"2347ed694b7ede06fcc8da1688b30f1bb8e122848ff10a43afd628cdac7c5c77"} Oct 06 12:21:56 crc kubenswrapper[4892]: I1006 12:21:56.856558 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" 
event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerStarted","Data":"052ab44579db51fbcd504481eb9afcbbd9a185ab8f9240bd51ba0dc0f4a8a856"} Oct 06 12:21:56 crc kubenswrapper[4892]: I1006 12:21:56.856571 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerStarted","Data":"28ebd8c8b8556975abc41d93f53eb2f4241f1619d5b7b2a3871c9518dcbe7ffd"} Oct 06 12:21:56 crc kubenswrapper[4892]: I1006 12:21:56.856583 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerStarted","Data":"330f0f14bf3074c929f8484863cf063b54cda88235ffc61a2ec2f1cf4a3559b5"} Oct 06 12:21:57 crc kubenswrapper[4892]: I1006 12:21:57.879304 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-zqfvv" event={"ID":"344945cb-67e7-4600-a300-676dcddc3659","Type":"ContainerStarted","Data":"4b494851e3a53e03aaa58047b29e0cda8082681b4628b64d8d552f04e1515e05"} Oct 06 12:21:57 crc kubenswrapper[4892]: I1006 12:21:57.879818 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-zqfvv" Oct 06 12:21:57 crc kubenswrapper[4892]: I1006 12:21:57.921206 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-zqfvv" podStartSLOduration=6.602020163 podStartE2EDuration="13.921167849s" podCreationTimestamp="2025-10-06 12:21:44 +0000 UTC" firstStartedPulling="2025-10-06 12:21:45.580391971 +0000 UTC m=+792.130097746" lastFinishedPulling="2025-10-06 12:21:52.899539637 +0000 UTC m=+799.449245432" observedRunningTime="2025-10-06 12:21:57.917419792 +0000 UTC m=+804.467125577" watchObservedRunningTime="2025-10-06 12:21:57.921167849 +0000 UTC m=+804.470873644" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.422138 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-m4lqh"] Oct 06 12:21:59 crc kubenswrapper[4892]: E1006 12:21:59.422850 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="extract-content" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.422872 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="extract-content" Oct 06 12:21:59 crc kubenswrapper[4892]: E1006 12:21:59.422910 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="registry-server" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.422923 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="registry-server" Oct 06 12:21:59 crc kubenswrapper[4892]: E1006 12:21:59.422946 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="extract-utilities" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.422959 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="extract-utilities" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.423159 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2522d11-54ae-40b7-83d7-ac88b19a9bb9" containerName="registry-server" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.423850 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-m4lqh" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.429013 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-l9hrj" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.429077 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.429580 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.442252 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m4lqh"] Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.583305 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vgd\" (UniqueName: \"kubernetes.io/projected/4ade8d7c-b35d-4458-ab63-7cce4f4e7b50-kube-api-access-49vgd\") pod \"openstack-operator-index-m4lqh\" (UID: \"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50\") " pod="openstack-operators/openstack-operator-index-m4lqh" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.684521 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vgd\" (UniqueName: \"kubernetes.io/projected/4ade8d7c-b35d-4458-ab63-7cce4f4e7b50-kube-api-access-49vgd\") pod \"openstack-operator-index-m4lqh\" (UID: \"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50\") " pod="openstack-operators/openstack-operator-index-m4lqh" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.711217 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vgd\" (UniqueName: \"kubernetes.io/projected/4ade8d7c-b35d-4458-ab63-7cce4f4e7b50-kube-api-access-49vgd\") pod \"openstack-operator-index-m4lqh\" (UID: \"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50\") " pod="openstack-operators/openstack-operator-index-m4lqh" Oct 06 12:21:59 crc kubenswrapper[4892]: I1006 12:21:59.755515 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:22:00 crc kubenswrapper[4892]: I1006 12:22:00.184260 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-m4lqh"]
Oct 06 12:22:00 crc kubenswrapper[4892]: W1006 12:22:00.184661 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ade8d7c_b35d_4458_ab63_7cce4f4e7b50.slice/crio-3540c331fece21d6641740532805caff51f4a75fb8043172ec4e69cea77ffebb WatchSource:0}: Error finding container 3540c331fece21d6641740532805caff51f4a75fb8043172ec4e69cea77ffebb: Status 404 returned error can't find the container with id 3540c331fece21d6641740532805caff51f4a75fb8043172ec4e69cea77ffebb
Oct 06 12:22:00 crc kubenswrapper[4892]: I1006 12:22:00.414743 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-zqfvv"
Oct 06 12:22:00 crc kubenswrapper[4892]: I1006 12:22:00.453143 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-zqfvv"
Oct 06 12:22:00 crc kubenswrapper[4892]: I1006 12:22:00.916578 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m4lqh" event={"ID":"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50","Type":"ContainerStarted","Data":"3540c331fece21d6641740532805caff51f4a75fb8043172ec4e69cea77ffebb"}
Oct 06 12:22:02 crc kubenswrapper[4892]: I1006 12:22:02.772868 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m4lqh"]
Oct 06 12:22:02 crc kubenswrapper[4892]: I1006 12:22:02.938589 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m4lqh" event={"ID":"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50","Type":"ContainerStarted","Data":"e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1"}
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.377052 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-m4lqh" podStartSLOduration=2.140374536 podStartE2EDuration="4.377015617s" podCreationTimestamp="2025-10-06 12:21:59 +0000 UTC" firstStartedPulling="2025-10-06 12:22:00.187410157 +0000 UTC m=+806.737115932" lastFinishedPulling="2025-10-06 12:22:02.424051248 +0000 UTC m=+808.973757013" observedRunningTime="2025-10-06 12:22:02.95985231 +0000 UTC m=+809.509558115" watchObservedRunningTime="2025-10-06 12:22:03.377015617 +0000 UTC m=+809.926721422"
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.379745 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kdnq4"]
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.381383 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kdnq4"
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.395704 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kdnq4"]
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.541377 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwgzj\" (UniqueName: \"kubernetes.io/projected/21f6370c-bc30-4e85-866b-d15376b5d6c8-kube-api-access-dwgzj\") pod \"openstack-operator-index-kdnq4\" (UID: \"21f6370c-bc30-4e85-866b-d15376b5d6c8\") " pod="openstack-operators/openstack-operator-index-kdnq4"
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.643415 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwgzj\" (UniqueName: \"kubernetes.io/projected/21f6370c-bc30-4e85-866b-d15376b5d6c8-kube-api-access-dwgzj\") pod \"openstack-operator-index-kdnq4\" (UID: \"21f6370c-bc30-4e85-866b-d15376b5d6c8\") " pod="openstack-operators/openstack-operator-index-kdnq4"
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.678506 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwgzj\" (UniqueName: \"kubernetes.io/projected/21f6370c-bc30-4e85-866b-d15376b5d6c8-kube-api-access-dwgzj\") pod \"openstack-operator-index-kdnq4\" (UID: \"21f6370c-bc30-4e85-866b-d15376b5d6c8\") " pod="openstack-operators/openstack-operator-index-kdnq4"
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.720278 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-kdnq4"
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.947599 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-m4lqh" podUID="4ade8d7c-b35d-4458-ab63-7cce4f4e7b50" containerName="registry-server" containerID="cri-o://e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1" gracePeriod=2
Oct 06 12:22:03 crc kubenswrapper[4892]: I1006 12:22:03.971829 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kdnq4"]
Oct 06 12:22:03 crc kubenswrapper[4892]: W1006 12:22:03.980138 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21f6370c_bc30_4e85_866b_d15376b5d6c8.slice/crio-4f7ea5bddb32a078226461aa1b7f6d0e4936d4e8e89c696b7fb5810b2c369891 WatchSource:0}: Error finding container 4f7ea5bddb32a078226461aa1b7f6d0e4936d4e8e89c696b7fb5810b2c369891: Status 404 returned error can't find the container with id 4f7ea5bddb32a078226461aa1b7f6d0e4936d4e8e89c696b7fb5810b2c369891
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.333470 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m4lqh"
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.460457 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49vgd\" (UniqueName: \"kubernetes.io/projected/4ade8d7c-b35d-4458-ab63-7cce4f4e7b50-kube-api-access-49vgd\") pod \"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50\" (UID: \"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50\") "
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.468358 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ade8d7c-b35d-4458-ab63-7cce4f4e7b50-kube-api-access-49vgd" (OuterVolumeSpecName: "kube-api-access-49vgd") pod "4ade8d7c-b35d-4458-ab63-7cce4f4e7b50" (UID: "4ade8d7c-b35d-4458-ab63-7cce4f4e7b50"). InnerVolumeSpecName "kube-api-access-49vgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.562094 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49vgd\" (UniqueName: \"kubernetes.io/projected/4ade8d7c-b35d-4458-ab63-7cce4f4e7b50-kube-api-access-49vgd\") on node \"crc\" DevicePath \"\""
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.835788 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-pcrl5"
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.969886 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kdnq4" event={"ID":"21f6370c-bc30-4e85-866b-d15376b5d6c8","Type":"ContainerStarted","Data":"ddfce718b6060b09ee1175dee179961f60a75d9d1e1a3adcb687b020388fd9d7"}
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.969960 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kdnq4" event={"ID":"21f6370c-bc30-4e85-866b-d15376b5d6c8","Type":"ContainerStarted","Data":"4f7ea5bddb32a078226461aa1b7f6d0e4936d4e8e89c696b7fb5810b2c369891"}
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.973813 4892 generic.go:334] "Generic (PLEG): container finished" podID="4ade8d7c-b35d-4458-ab63-7cce4f4e7b50" containerID="e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1" exitCode=0
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.973870 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m4lqh" event={"ID":"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50","Type":"ContainerDied","Data":"e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1"}
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.973903 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-m4lqh" event={"ID":"4ade8d7c-b35d-4458-ab63-7cce4f4e7b50","Type":"ContainerDied","Data":"3540c331fece21d6641740532805caff51f4a75fb8043172ec4e69cea77ffebb"}
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.973932 4892 scope.go:117] "RemoveContainer" containerID="e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1"
Oct 06 12:22:04 crc kubenswrapper[4892]: I1006 12:22:04.974094 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-m4lqh"
Oct 06 12:22:05 crc kubenswrapper[4892]: I1006 12:22:05.000441 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kdnq4" podStartSLOduration=1.947419193 podStartE2EDuration="2.000257385s" podCreationTimestamp="2025-10-06 12:22:03 +0000 UTC" firstStartedPulling="2025-10-06 12:22:03.985451168 +0000 UTC m=+810.535156943" lastFinishedPulling="2025-10-06 12:22:04.03828933 +0000 UTC m=+810.587995135" observedRunningTime="2025-10-06 12:22:04.985902955 +0000 UTC m=+811.535608750" watchObservedRunningTime="2025-10-06 12:22:05.000257385 +0000 UTC m=+811.549963190"
Oct 06 12:22:05 crc kubenswrapper[4892]: I1006 12:22:05.008957 4892 scope.go:117] "RemoveContainer" containerID="e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1"
Oct 06 12:22:05 crc kubenswrapper[4892]: E1006 12:22:05.009989 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1\": container with ID starting with e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1 not found: ID does not exist" containerID="e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1"
Oct 06 12:22:05 crc kubenswrapper[4892]: I1006 12:22:05.010037 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1"} err="failed to get container status \"e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1\": rpc error: code = NotFound desc = could not find container \"e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1\": container with ID starting with e62fe539e52725f1bd025c1d2e9787a52150be67380ed336429cb8c02aad29b1 not found: ID does not exist"
Oct 06 12:22:05 crc kubenswrapper[4892]: I1006 12:22:05.022816 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-m4lqh"]
Oct 06 12:22:05 crc kubenswrapper[4892]: I1006 12:22:05.027122 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-m4lqh"]
Oct 06 12:22:05 crc kubenswrapper[4892]: I1006 12:22:05.420662 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-zqfvv"
Oct 06 12:22:05 crc kubenswrapper[4892]: I1006 12:22:05.534234 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-xnc9c"
Oct 06 12:22:06 crc kubenswrapper[4892]: I1006 12:22:06.183404 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ade8d7c-b35d-4458-ab63-7cce4f4e7b50" path="/var/lib/kubelet/pods/4ade8d7c-b35d-4458-ab63-7cce4f4e7b50/volumes"
Oct 06 12:22:06 crc kubenswrapper[4892]: I1006 12:22:06.986690 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-48mw2"]
Oct 06 12:22:06 crc kubenswrapper[4892]: E1006 12:22:06.987231 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ade8d7c-b35d-4458-ab63-7cce4f4e7b50" containerName="registry-server"
Oct 06 12:22:06 crc kubenswrapper[4892]: I1006 12:22:06.987270 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ade8d7c-b35d-4458-ab63-7cce4f4e7b50" containerName="registry-server"
Oct 06 12:22:06 crc kubenswrapper[4892]: I1006 12:22:06.987645 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ade8d7c-b35d-4458-ab63-7cce4f4e7b50" containerName="registry-server"
Oct 06 12:22:06 crc kubenswrapper[4892]: I1006 12:22:06.989658 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.005903 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48mw2"]
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.099556 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-catalog-content\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.099683 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5d25\" (UniqueName: \"kubernetes.io/projected/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-kube-api-access-c5d25\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.099726 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-utilities\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.201216 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5d25\" (UniqueName: \"kubernetes.io/projected/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-kube-api-access-c5d25\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.201277 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-utilities\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.201365 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-catalog-content\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.201994 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-catalog-content\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.202089 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-utilities\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.225602 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5d25\" (UniqueName: \"kubernetes.io/projected/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-kube-api-access-c5d25\") pod \"certified-operators-48mw2\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.310905 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48mw2"
Oct 06 12:22:07 crc kubenswrapper[4892]: I1006 12:22:07.796632 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48mw2"]
Oct 06 12:22:07 crc kubenswrapper[4892]: W1006 12:22:07.805197 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79a252e2_e3cc_4c1a_912e_f947e74c6cf8.slice/crio-eef26edd8bb4b3902f283608ccaef87736934edeb3226b1a4648ede8f8c31b0d WatchSource:0}: Error finding container eef26edd8bb4b3902f283608ccaef87736934edeb3226b1a4648ede8f8c31b0d: Status 404 returned error can't find the container with id eef26edd8bb4b3902f283608ccaef87736934edeb3226b1a4648ede8f8c31b0d
Oct 06 12:22:08 crc kubenswrapper[4892]: I1006 12:22:08.001513 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48mw2" event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerStarted","Data":"d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0"}
Oct 06 12:22:08 crc kubenswrapper[4892]: I1006 12:22:08.001577 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48mw2" event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerStarted","Data":"eef26edd8bb4b3902f283608ccaef87736934edeb3226b1a4648ede8f8c31b0d"}
Oct 06 12:22:09 crc kubenswrapper[4892]: I1006 12:22:09.014848 4892 generic.go:334] "Generic (PLEG): container finished" podID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerID="d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0" exitCode=0
Oct 06 12:22:09 crc kubenswrapper[4892]: I1006 12:22:09.014932 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48mw2" event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerDied","Data":"d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0"}
Oct 06 12:22:11 crc kubenswrapper[4892]: I1006 12:22:11.034642 4892 generic.go:334] "Generic (PLEG): container finished" podID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerID="f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6" exitCode=0
Oct 06 12:22:11 crc kubenswrapper[4892]: I1006 12:22:11.034754 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48mw2" event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerDied","Data":"f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6"}
Oct 06 12:22:12 crc kubenswrapper[4892]: I1006 12:22:12.048060 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48mw2" event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerStarted","Data":"b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5"}
event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerStarted","Data":"b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5"} Oct 06 12:22:12 crc kubenswrapper[4892]: I1006 12:22:12.077165 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-48mw2" podStartSLOduration=3.4814454120000002 podStartE2EDuration="6.077149439s" podCreationTimestamp="2025-10-06 12:22:06 +0000 UTC" firstStartedPulling="2025-10-06 12:22:09.017821477 +0000 UTC m=+815.567527282" lastFinishedPulling="2025-10-06 12:22:11.613525514 +0000 UTC m=+818.163231309" observedRunningTime="2025-10-06 12:22:12.074182344 +0000 UTC m=+818.623888119" watchObservedRunningTime="2025-10-06 12:22:12.077149439 +0000 UTC m=+818.626855214" Oct 06 12:22:13 crc kubenswrapper[4892]: I1006 12:22:13.720474 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-kdnq4" Oct 06 12:22:13 crc kubenswrapper[4892]: I1006 12:22:13.720633 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-kdnq4" Oct 06 12:22:13 crc kubenswrapper[4892]: I1006 12:22:13.749496 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-kdnq4" Oct 06 12:22:14 crc kubenswrapper[4892]: I1006 12:22:14.102053 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-kdnq4" Oct 06 12:22:17 crc kubenswrapper[4892]: I1006 12:22:17.311643 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-48mw2" Oct 06 12:22:17 crc kubenswrapper[4892]: I1006 12:22:17.312101 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-48mw2" Oct 06 12:22:17 crc kubenswrapper[4892]: I1006 12:22:17.381554 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-48mw2" Oct 06 12:22:18 crc kubenswrapper[4892]: I1006 12:22:18.213342 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-48mw2" Oct 06 12:22:19 crc kubenswrapper[4892]: I1006 12:22:19.774023 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48mw2"] Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.115627 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-48mw2" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="registry-server" containerID="cri-o://b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5" gracePeriod=2 Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.566129 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-48mw2" Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.632236 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-catalog-content\") pod \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.632301 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-utilities\") pod \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.632487 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5d25\" (UniqueName: \"kubernetes.io/projected/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-kube-api-access-c5d25\") pod \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\" (UID: \"79a252e2-e3cc-4c1a-912e-f947e74c6cf8\") " Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.635428 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-utilities" (OuterVolumeSpecName: "utilities") pod "79a252e2-e3cc-4c1a-912e-f947e74c6cf8" (UID: "79a252e2-e3cc-4c1a-912e-f947e74c6cf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.642450 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-kube-api-access-c5d25" (OuterVolumeSpecName: "kube-api-access-c5d25") pod "79a252e2-e3cc-4c1a-912e-f947e74c6cf8" (UID: "79a252e2-e3cc-4c1a-912e-f947e74c6cf8"). InnerVolumeSpecName "kube-api-access-c5d25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.702989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79a252e2-e3cc-4c1a-912e-f947e74c6cf8" (UID: "79a252e2-e3cc-4c1a-912e-f947e74c6cf8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.733834 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.734081 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:21 crc kubenswrapper[4892]: I1006 12:22:21.734108 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5d25\" (UniqueName: \"kubernetes.io/projected/79a252e2-e3cc-4c1a-912e-f947e74c6cf8-kube-api-access-c5d25\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.128490 4892 generic.go:334] "Generic (PLEG): container finished" podID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerID="b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5" exitCode=0 Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.128582 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48mw2" event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerDied","Data":"b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5"} Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.128719 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48mw2" event={"ID":"79a252e2-e3cc-4c1a-912e-f947e74c6cf8","Type":"ContainerDied","Data":"eef26edd8bb4b3902f283608ccaef87736934edeb3226b1a4648ede8f8c31b0d"} Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.128756 4892 scope.go:117] "RemoveContainer" containerID="b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.128617 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-48mw2" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.163613 4892 scope.go:117] "RemoveContainer" containerID="f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.191764 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48mw2"] Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.191846 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-48mw2"] Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.207861 4892 scope.go:117] "RemoveContainer" containerID="d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.235504 4892 scope.go:117] "RemoveContainer" containerID="b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5" Oct 06 12:22:22 crc kubenswrapper[4892]: E1006 12:22:22.236063 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5\": container with ID starting with b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5 not found: ID does not exist" containerID="b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.236136 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5"} err="failed to get container status \"b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5\": rpc error: code = NotFound desc = could not find container \"b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5\": container with ID starting with b38b69fb3ac8c8251abbe307c5152bb621dbd2633e6efe42cc1fcd511f59e5b5 not found: ID does not exist" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.236193 4892 scope.go:117] "RemoveContainer" containerID="f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6" Oct 06 12:22:22 crc kubenswrapper[4892]: E1006 12:22:22.236733 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6\": container with ID starting with f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6 not found: ID does not exist" containerID="f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.236766 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6"} err="failed to get container status \"f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6\": rpc error: code = NotFound desc = could not find container \"f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6\": container with ID starting with f50a1270ef66ded509bded212838674f4a1351d608ed1a5b59a3f4f582223be6 not found: ID does not exist" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.236789 4892 scope.go:117] "RemoveContainer" containerID="d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0" Oct 06 12:22:22 crc kubenswrapper[4892]: E1006 12:22:22.237451 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0\": container with ID starting with d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0 not found: ID does not exist" containerID="d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.237496 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0"} err="failed to get container status \"d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0\": rpc error: code = NotFound desc = could not find container \"d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0\": container with ID starting with d00bcc5ae2a9ac9c53539d76307b082998cc5ab9a9f97a1969d497714399ade0 not found: ID does not exist" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.843858 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f"] Oct 06 12:22:22 crc kubenswrapper[4892]: E1006 12:22:22.844216 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="registry-server" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.844239 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="registry-server" Oct 06 12:22:22 crc kubenswrapper[4892]: E1006 12:22:22.844267 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="extract-content" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.844281 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="extract-content" Oct 06 12:22:22 crc kubenswrapper[4892]: E1006 12:22:22.844314 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="extract-utilities" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.844355 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="extract-utilities" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.844570 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" containerName="registry-server" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.846080 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.849011 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-ln5q5" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.854752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks4lc\" (UniqueName: \"kubernetes.io/projected/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-kube-api-access-ks4lc\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.854869 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-bundle\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.855027 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-util\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.869967 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f"] Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.956712 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-util\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.957103 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks4lc\" (UniqueName: \"kubernetes.io/projected/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-kube-api-access-ks4lc\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.957303 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-bundle\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.957650 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-util\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.957944 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-bundle\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.984992 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.985092 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:22:22 crc kubenswrapper[4892]: I1006 12:22:22.994365 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks4lc\" (UniqueName: \"kubernetes.io/projected/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-kube-api-access-ks4lc\") pod \"b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") " pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:23 crc kubenswrapper[4892]: I1006 12:22:23.177047 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:23 crc kubenswrapper[4892]: I1006 12:22:23.683822 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f"] Oct 06 12:22:24 crc kubenswrapper[4892]: I1006 12:22:24.150766 4892 generic.go:334] "Generic (PLEG): container finished" podID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerID="fb826b9ac75aedefc635889c6e436f5831762466a44f9b0e4385af417c6f89ee" exitCode=0 Oct 06 12:22:24 crc kubenswrapper[4892]: I1006 12:22:24.150821 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" event={"ID":"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1","Type":"ContainerDied","Data":"fb826b9ac75aedefc635889c6e436f5831762466a44f9b0e4385af417c6f89ee"} Oct 06 12:22:24 crc kubenswrapper[4892]: I1006 12:22:24.151292 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" event={"ID":"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1","Type":"ContainerStarted","Data":"6404dfe959c8609fdbea7501a4ea3ec6f4da5d3e869823d00c45ff63b26ea4b9"} Oct 06 12:22:24 crc kubenswrapper[4892]: I1006 12:22:24.189235 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79a252e2-e3cc-4c1a-912e-f947e74c6cf8" path="/var/lib/kubelet/pods/79a252e2-e3cc-4c1a-912e-f947e74c6cf8/volumes" Oct 06 12:22:24 crc kubenswrapper[4892]: I1006 12:22:24.989625 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5chdl"] Oct 06 12:22:24 crc kubenswrapper[4892]: I1006 12:22:24.991607 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.009867 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5chdl"] Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.095195 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bkkw\" (UniqueName: \"kubernetes.io/projected/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-kube-api-access-8bkkw\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.095521 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-utilities\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.095592 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-catalog-content\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.161179 4892 generic.go:334] "Generic (PLEG): container finished" podID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerID="fd6cce5190120ad26f4719490ab2b9f49020ee6e17bf161af06cde584740092c" exitCode=0 Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.161289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" event={"ID":"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1","Type":"ContainerDied","Data":"fd6cce5190120ad26f4719490ab2b9f49020ee6e17bf161af06cde584740092c"} Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.196287 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bkkw\" (UniqueName: \"kubernetes.io/projected/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-kube-api-access-8bkkw\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.196387 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-utilities\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.196438 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-catalog-content\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.196832 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-catalog-content\") 
pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.197093 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-utilities\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.219880 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bkkw\" (UniqueName: \"kubernetes.io/projected/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-kube-api-access-8bkkw\") pod \"community-operators-5chdl\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.322633 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:25 crc kubenswrapper[4892]: E1006 12:22:25.491827 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ee60ce0_cf65_4875_a1e5_fc3c4b3542d1.slice/crio-09a2c871f04ec83d1ff6b0e17a2c5a9c3ec087a12f6f0e877e6e01810087cfc1.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:22:25 crc kubenswrapper[4892]: I1006 12:22:25.624898 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5chdl"] Oct 06 12:22:26 crc kubenswrapper[4892]: I1006 12:22:26.170676 4892 generic.go:334] "Generic (PLEG): container finished" podID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerID="4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70" exitCode=0 Oct 06 12:22:26 crc kubenswrapper[4892]: I1006 12:22:26.174596 4892 generic.go:334] "Generic (PLEG): container finished" podID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerID="09a2c871f04ec83d1ff6b0e17a2c5a9c3ec087a12f6f0e877e6e01810087cfc1" exitCode=0 Oct 06 12:22:26 crc kubenswrapper[4892]: I1006 12:22:26.189601 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5chdl" event={"ID":"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7","Type":"ContainerDied","Data":"4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70"} Oct 06 12:22:26 crc kubenswrapper[4892]: I1006 12:22:26.189949 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5chdl" event={"ID":"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7","Type":"ContainerStarted","Data":"91b9487921af86b01d506abf9e51ddde6e1dbbaafa29169220fe935af3070e27"} Oct 06 12:22:26 crc kubenswrapper[4892]: I1006 12:22:26.190203 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" event={"ID":"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1","Type":"ContainerDied","Data":"09a2c871f04ec83d1ff6b0e17a2c5a9c3ec087a12f6f0e877e6e01810087cfc1"} Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.181479 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5chdl" event={"ID":"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7","Type":"ContainerStarted","Data":"9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d"} Oct 06 12:22:27 crc 
kubenswrapper[4892]: I1006 12:22:27.528101 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f"
Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.537909 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks4lc\" (UniqueName: \"kubernetes.io/projected/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-kube-api-access-ks4lc\") pod \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") "
Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.537971 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-util\") pod \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") "
Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.538047 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-bundle\") pod \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\" (UID: \"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1\") "
Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.539367 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-bundle" (OuterVolumeSpecName: "bundle") pod "2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" (UID: "2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.548670 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-kube-api-access-ks4lc" (OuterVolumeSpecName: "kube-api-access-ks4lc") pod "2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" (UID: "2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1"). InnerVolumeSpecName "kube-api-access-ks4lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.575050 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-util" (OuterVolumeSpecName: "util") pod "2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" (UID: "2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
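
Note: the UnmountVolume/TearDown sequence above dismantles the finished bundle pod's three volumes; for emptyDir and projected volumes there is no block device behind the mount, which is why the "Volume detached ... DevicePath \"\"" entries that follow carry an empty device path. As a rough sketch (illustrative, not the actual manifest), the pod-author half of what is being torn down looks like this in Go:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        // Two scratch emptyDir volumes, as an unpack-style pod would declare them.
        // The third volume ("kube-api-access-ks4lc", a projected service-account
        // token) is injected by the control plane, not declared by the pod author.
        volumes := []corev1.Volume{
            {Name: "bundle", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
            {Name: "util", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
        }
        fmt.Println(len(volumes), "volumes declared")
    }
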
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.639892 4892 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.639935 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks4lc\" (UniqueName: \"kubernetes.io/projected/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-kube-api-access-ks4lc\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:27 crc kubenswrapper[4892]: I1006 12:22:27.639955 4892 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1-util\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:28 crc kubenswrapper[4892]: I1006 12:22:28.195020 4892 generic.go:334] "Generic (PLEG): container finished" podID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerID="9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d" exitCode=0 Oct 06 12:22:28 crc kubenswrapper[4892]: I1006 12:22:28.196737 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5chdl" event={"ID":"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7","Type":"ContainerDied","Data":"9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d"} Oct 06 12:22:28 crc kubenswrapper[4892]: I1006 12:22:28.201508 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" event={"ID":"2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1","Type":"ContainerDied","Data":"6404dfe959c8609fdbea7501a4ea3ec6f4da5d3e869823d00c45ff63b26ea4b9"} Oct 06 12:22:28 crc kubenswrapper[4892]: I1006 12:22:28.201568 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6404dfe959c8609fdbea7501a4ea3ec6f4da5d3e869823d00c45ff63b26ea4b9" Oct 06 12:22:28 crc kubenswrapper[4892]: I1006 12:22:28.201671 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f" Oct 06 12:22:29 crc kubenswrapper[4892]: I1006 12:22:29.219839 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5chdl" event={"ID":"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7","Type":"ContainerStarted","Data":"2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee"} Oct 06 12:22:29 crc kubenswrapper[4892]: I1006 12:22:29.260775 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5chdl" podStartSLOduration=2.733622242 podStartE2EDuration="5.260748406s" podCreationTimestamp="2025-10-06 12:22:24 +0000 UTC" firstStartedPulling="2025-10-06 12:22:26.172950258 +0000 UTC m=+832.722656033" lastFinishedPulling="2025-10-06 12:22:28.700076412 +0000 UTC m=+835.249782197" observedRunningTime="2025-10-06 12:22:29.251505911 +0000 UTC m=+835.801211736" watchObservedRunningTime="2025-10-06 12:22:29.260748406 +0000 UTC m=+835.810454181" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.761856 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-c85944558-7trks"] Oct 06 12:22:31 crc kubenswrapper[4892]: E1006 12:22:31.762443 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerName="pull" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.762471 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerName="pull" Oct 06 12:22:31 crc kubenswrapper[4892]: E1006 12:22:31.762487 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerName="util" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.762494 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerName="util" Oct 06 12:22:31 crc kubenswrapper[4892]: E1006 12:22:31.762511 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerName="extract" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.762518 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerName="extract" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.762652 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1" containerName="extract" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.763242 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.765985 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-qcpl2" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.791810 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-c85944558-7trks"] Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.811094 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5vxw\" (UniqueName: \"kubernetes.io/projected/883f9aa6-2f29-4f1b-b2f5-0581ab6853f9-kube-api-access-r5vxw\") pod \"openstack-operator-controller-operator-c85944558-7trks\" (UID: \"883f9aa6-2f29-4f1b-b2f5-0581ab6853f9\") " pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.912162 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5vxw\" (UniqueName: \"kubernetes.io/projected/883f9aa6-2f29-4f1b-b2f5-0581ab6853f9-kube-api-access-r5vxw\") pod \"openstack-operator-controller-operator-c85944558-7trks\" (UID: \"883f9aa6-2f29-4f1b-b2f5-0581ab6853f9\") " pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" Oct 06 12:22:31 crc kubenswrapper[4892]: I1006 12:22:31.940834 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5vxw\" (UniqueName: \"kubernetes.io/projected/883f9aa6-2f29-4f1b-b2f5-0581ab6853f9-kube-api-access-r5vxw\") pod \"openstack-operator-controller-operator-c85944558-7trks\" (UID: \"883f9aa6-2f29-4f1b-b2f5-0581ab6853f9\") " pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" Oct 06 12:22:32 crc kubenswrapper[4892]: I1006 12:22:32.096038 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" Oct 06 12:22:32 crc kubenswrapper[4892]: I1006 12:22:32.573831 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-c85944558-7trks"] Oct 06 12:22:33 crc kubenswrapper[4892]: I1006 12:22:33.258294 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" event={"ID":"883f9aa6-2f29-4f1b-b2f5-0581ab6853f9","Type":"ContainerStarted","Data":"218fdf81ab81d82a85451791ac2368a6f577ca759431ee15db105a93c3c28908"} Oct 06 12:22:35 crc kubenswrapper[4892]: I1006 12:22:35.323594 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:35 crc kubenswrapper[4892]: I1006 12:22:35.323867 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:35 crc kubenswrapper[4892]: I1006 12:22:35.389777 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:36 crc kubenswrapper[4892]: I1006 12:22:36.329098 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:37 crc kubenswrapper[4892]: I1006 12:22:37.171258 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5chdl"] Oct 06 12:22:38 crc kubenswrapper[4892]: I1006 12:22:38.289142 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" event={"ID":"883f9aa6-2f29-4f1b-b2f5-0581ab6853f9","Type":"ContainerStarted","Data":"2663a99e146124ae4cfad9af71fd9b288cce49ec0b1332b3ffcd6885ebe855f8"} Oct 06 12:22:38 crc kubenswrapper[4892]: I1006 12:22:38.289283 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5chdl" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="registry-server" containerID="cri-o://2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee" gracePeriod=2 Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.016430 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.120201 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-utilities\") pod \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.120396 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-catalog-content\") pod \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.120554 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bkkw\" (UniqueName: \"kubernetes.io/projected/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-kube-api-access-8bkkw\") pod \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\" (UID: \"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7\") " Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.120931 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-utilities" (OuterVolumeSpecName: "utilities") pod "2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" (UID: "2fb8c328-5db0-4014-8f0a-bace1e7ed8b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.121237 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.134638 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-kube-api-access-8bkkw" (OuterVolumeSpecName: "kube-api-access-8bkkw") pod "2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" (UID: "2fb8c328-5db0-4014-8f0a-bace1e7ed8b7"). InnerVolumeSpecName "kube-api-access-8bkkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.179684 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" (UID: "2fb8c328-5db0-4014-8f0a-bace1e7ed8b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.222738 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.222787 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bkkw\" (UniqueName: \"kubernetes.io/projected/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7-kube-api-access-8bkkw\") on node \"crc\" DevicePath \"\"" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.299183 4892 generic.go:334] "Generic (PLEG): container finished" podID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerID="2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee" exitCode=0 Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.299242 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5chdl" event={"ID":"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7","Type":"ContainerDied","Data":"2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee"} Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.299275 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5chdl" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.299293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5chdl" event={"ID":"2fb8c328-5db0-4014-8f0a-bace1e7ed8b7","Type":"ContainerDied","Data":"91b9487921af86b01d506abf9e51ddde6e1dbbaafa29169220fe935af3070e27"} Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.299349 4892 scope.go:117] "RemoveContainer" containerID="2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.343176 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5chdl"] Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.351466 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5chdl"] Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.604556 4892 scope.go:117] "RemoveContainer" containerID="9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.645548 4892 scope.go:117] "RemoveContainer" containerID="4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.689169 4892 scope.go:117] "RemoveContainer" containerID="2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee" Oct 06 12:22:39 crc kubenswrapper[4892]: E1006 12:22:39.689527 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee\": container with ID starting with 2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee not found: ID does not exist" containerID="2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.689562 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee"} err="failed to get container status 
\"2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee\": rpc error: code = NotFound desc = could not find container \"2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee\": container with ID starting with 2eda3a9c216f5c61580e6a850df580e86d0164c4b26b9a924ba16217b6898eee not found: ID does not exist" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.689588 4892 scope.go:117] "RemoveContainer" containerID="9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d" Oct 06 12:22:39 crc kubenswrapper[4892]: E1006 12:22:39.689893 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d\": container with ID starting with 9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d not found: ID does not exist" containerID="9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.689918 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d"} err="failed to get container status \"9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d\": rpc error: code = NotFound desc = could not find container \"9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d\": container with ID starting with 9cd3157fff7ba7e31c300b46d8157b05242b2e247a0b16c714edfee357b2790d not found: ID does not exist" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.689935 4892 scope.go:117] "RemoveContainer" containerID="4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70" Oct 06 12:22:39 crc kubenswrapper[4892]: E1006 12:22:39.690225 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70\": container with ID starting with 4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70 not found: ID does not exist" containerID="4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70" Oct 06 12:22:39 crc kubenswrapper[4892]: I1006 12:22:39.690249 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70"} err="failed to get container status \"4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70\": rpc error: code = NotFound desc = could not find container \"4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70\": container with ID starting with 4148ed923686e877d77325ceb87a5d5118118271f2dee5de3a0c5a320e53bf70 not found: ID does not exist" Oct 06 12:22:40 crc kubenswrapper[4892]: I1006 12:22:40.183507 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" path="/var/lib/kubelet/pods/2fb8c328-5db0-4014-8f0a-bace1e7ed8b7/volumes" Oct 06 12:22:40 crc kubenswrapper[4892]: I1006 12:22:40.310030 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" event={"ID":"883f9aa6-2f29-4f1b-b2f5-0581ab6853f9","Type":"ContainerStarted","Data":"f2b7fbfa935702c85d8aa9ba827fbec62c0b8240404cf156139907820df8fe0a"} Oct 06 12:22:40 crc kubenswrapper[4892]: I1006 12:22:40.310169 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" Oct 06 12:22:40 crc kubenswrapper[4892]: I1006 12:22:40.364444 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" podStartSLOduration=2.280635191 podStartE2EDuration="9.364414055s" podCreationTimestamp="2025-10-06 12:22:31 +0000 UTC" firstStartedPulling="2025-10-06 12:22:32.581614851 +0000 UTC m=+839.131320616" lastFinishedPulling="2025-10-06 12:22:39.665393705 +0000 UTC m=+846.215099480" observedRunningTime="2025-10-06 12:22:40.352683178 +0000 UTC m=+846.902388983" watchObservedRunningTime="2025-10-06 12:22:40.364414055 +0000 UTC m=+846.914119860" Oct 06 12:22:42 crc kubenswrapper[4892]: I1006 12:22:42.099564 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-c85944558-7trks" Oct 06 12:22:52 crc kubenswrapper[4892]: I1006 12:22:52.984745 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:22:52 crc kubenswrapper[4892]: I1006 12:22:52.985246 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:22:52 crc kubenswrapper[4892]: I1006 12:22:52.985295 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:22:52 crc kubenswrapper[4892]: I1006 12:22:52.985981 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65a23fa133935013fcfc189b72a8929bb8d601f4fefb20b891097d8ee152e268"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:22:52 crc kubenswrapper[4892]: I1006 12:22:52.986041 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://65a23fa133935013fcfc189b72a8929bb8d601f4fefb20b891097d8ee152e268" gracePeriod=600 Oct 06 12:22:53 crc kubenswrapper[4892]: I1006 12:22:53.418596 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="65a23fa133935013fcfc189b72a8929bb8d601f4fefb20b891097d8ee152e268" exitCode=0 Oct 06 12:22:53 crc kubenswrapper[4892]: I1006 12:22:53.418765 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"65a23fa133935013fcfc189b72a8929bb8d601f4fefb20b891097d8ee152e268"} Oct 06 12:22:53 crc kubenswrapper[4892]: I1006 12:22:53.418968 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" 
event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"f99cab5f831d4479bae318ede8be6239cf73affb4f0ae80b3e22b31bc2f59223"} Oct 06 12:22:53 crc kubenswrapper[4892]: I1006 12:22:53.419002 4892 scope.go:117] "RemoveContainer" containerID="e79f1d5ccf3f44bb99369047594e47c87713adc95a8cf367cf3aea501eded4f0" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.821063 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7"] Oct 06 12:23:18 crc kubenswrapper[4892]: E1006 12:23:18.822772 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="extract-content" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.822813 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="extract-content" Oct 06 12:23:18 crc kubenswrapper[4892]: E1006 12:23:18.822857 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="extract-utilities" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.822866 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="extract-utilities" Oct 06 12:23:18 crc kubenswrapper[4892]: E1006 12:23:18.822907 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="registry-server" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.822915 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="registry-server" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.823544 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb8c328-5db0-4014-8f0a-bace1e7ed8b7" containerName="registry-server" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.825402 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.835268 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mqzvk" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.849412 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.851430 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.858716 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-n4kpj" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.866927 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.876452 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.877536 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.881950 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-c2zs8" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.893067 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.914389 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.925787 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.926868 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.929827 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.934960 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.935566 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-c4bb4" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.938672 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-cqznp" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.951247 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.952011 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdf9f\" (UniqueName: \"kubernetes.io/projected/1de023aa-81a7-4510-a6a3-93010ca572be-kube-api-access-xdf9f\") pod \"cinder-operator-controller-manager-7d4d4f8d-p2ncw\" (UID: \"1de023aa-81a7-4510-a6a3-93010ca572be\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.952135 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs87g\" (UniqueName: \"kubernetes.io/projected/4c2bced1-04b0-4525-b1d3-c3adc3669b68-kube-api-access-vs87g\") pod \"barbican-operator-controller-manager-58c4cd55f4-b6gs7\" (UID: \"4c2bced1-04b0-4525-b1d3-c3adc3669b68\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.952164 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j27x\" (UniqueName: \"kubernetes.io/projected/718cc1ac-8554-450f-bca5-5449909339dd-kube-api-access-5j27x\") pod \"designate-operator-controller-manager-75dfd9b554-68ldg\" (UID: \"718cc1ac-8554-450f-bca5-5449909339dd\") " 
pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.955360 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.956366 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.958941 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vq7fd" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.964731 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.977907 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.993146 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w"] Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.994184 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.995681 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.995710 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9ssdq" Oct 06 12:23:18 crc kubenswrapper[4892]: I1006 12:23:18.999433 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.000646 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.003533 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-ptbr9" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.018485 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.029015 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.030177 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.032554 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-sxx9j" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.051490 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.052776 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.052890 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfjcc\" (UniqueName: \"kubernetes.io/projected/773fe79a-1318-46ee-87bc-99786396705c-kube-api-access-wfjcc\") pod \"glance-operator-controller-manager-5dc44df7d5-gtrv6\" (UID: \"773fe79a-1318-46ee-87bc-99786396705c\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.052948 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs87g\" (UniqueName: \"kubernetes.io/projected/4c2bced1-04b0-4525-b1d3-c3adc3669b68-kube-api-access-vs87g\") pod \"barbican-operator-controller-manager-58c4cd55f4-b6gs7\" (UID: \"4c2bced1-04b0-4525-b1d3-c3adc3669b68\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.052969 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j27x\" (UniqueName: \"kubernetes.io/projected/718cc1ac-8554-450f-bca5-5449909339dd-kube-api-access-5j27x\") pod \"designate-operator-controller-manager-75dfd9b554-68ldg\" (UID: \"718cc1ac-8554-450f-bca5-5449909339dd\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.052997 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppvz\" (UniqueName: \"kubernetes.io/projected/3b478830-010a-408c-9eb4-0eaa51f75c31-kube-api-access-5ppvz\") pod \"horizon-operator-controller-manager-76d5b87f47-qdxqw\" (UID: \"3b478830-010a-408c-9eb4-0eaa51f75c31\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.053015 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g55wr\" (UniqueName: \"kubernetes.io/projected/5c85c0be-894b-4469-820c-35cac2b32905-kube-api-access-g55wr\") pod \"infra-operator-controller-manager-658588b8c9-bdx4w\" (UID: \"5c85c0be-894b-4469-820c-35cac2b32905\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.053036 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c85c0be-894b-4469-820c-35cac2b32905-cert\") pod \"infra-operator-controller-manager-658588b8c9-bdx4w\" (UID: \"5c85c0be-894b-4469-820c-35cac2b32905\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:19 crc 
kubenswrapper[4892]: I1006 12:23:19.053060 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmnc\" (UniqueName: \"kubernetes.io/projected/ed10a250-1c0b-4fc4-9906-6e01dba78a1e-kube-api-access-jmmnc\") pod \"heat-operator-controller-manager-54b4974c45-s8hw9\" (UID: \"ed10a250-1c0b-4fc4-9906-6e01dba78a1e\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.053095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdf9f\" (UniqueName: \"kubernetes.io/projected/1de023aa-81a7-4510-a6a3-93010ca572be-kube-api-access-xdf9f\") pod \"cinder-operator-controller-manager-7d4d4f8d-p2ncw\" (UID: \"1de023aa-81a7-4510-a6a3-93010ca572be\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.057119 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-24jr6" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.060431 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.075669 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.084886 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs87g\" (UniqueName: \"kubernetes.io/projected/4c2bced1-04b0-4525-b1d3-c3adc3669b68-kube-api-access-vs87g\") pod \"barbican-operator-controller-manager-58c4cd55f4-b6gs7\" (UID: \"4c2bced1-04b0-4525-b1d3-c3adc3669b68\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.084891 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdf9f\" (UniqueName: \"kubernetes.io/projected/1de023aa-81a7-4510-a6a3-93010ca572be-kube-api-access-xdf9f\") pod \"cinder-operator-controller-manager-7d4d4f8d-p2ncw\" (UID: \"1de023aa-81a7-4510-a6a3-93010ca572be\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.089277 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.091301 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j27x\" (UniqueName: \"kubernetes.io/projected/718cc1ac-8554-450f-bca5-5449909339dd-kube-api-access-5j27x\") pod \"designate-operator-controller-manager-75dfd9b554-68ldg\" (UID: \"718cc1ac-8554-450f-bca5-5449909339dd\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.102668 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.102787 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.104912 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-w224h" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.117282 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.127528 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.128651 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.130013 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-876q5" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.130151 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.140448 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.141684 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.145704 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-wg4f8" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.150548 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.151669 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.153266 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.161556 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-fqrzr" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.165839 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxtml\" (UniqueName: \"kubernetes.io/projected/7652cece-84f1-49ba-b99b-8e40047e7822-kube-api-access-kxtml\") pod \"manila-operator-controller-manager-65d89cfd9f-cfcl2\" (UID: \"7652cece-84f1-49ba-b99b-8e40047e7822\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.165887 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfjcc\" (UniqueName: \"kubernetes.io/projected/773fe79a-1318-46ee-87bc-99786396705c-kube-api-access-wfjcc\") pod \"glance-operator-controller-manager-5dc44df7d5-gtrv6\" (UID: \"773fe79a-1318-46ee-87bc-99786396705c\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.165911 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cwk4\" (UniqueName: \"kubernetes.io/projected/8031228c-d653-49ea-aa71-90709d299152-kube-api-access-4cwk4\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-s8527\" (UID: \"8031228c-d653-49ea-aa71-90709d299152\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.165939 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9hv\" (UniqueName: \"kubernetes.io/projected/8bba42dc-f728-4066-a83c-632c7dfd4502-kube-api-access-wg9hv\") pod \"ironic-operator-controller-manager-649675d675-h6gp4\" (UID: \"8bba42dc-f728-4066-a83c-632c7dfd4502\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.165985 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppvz\" (UniqueName: \"kubernetes.io/projected/3b478830-010a-408c-9eb4-0eaa51f75c31-kube-api-access-5ppvz\") pod \"horizon-operator-controller-manager-76d5b87f47-qdxqw\" (UID: \"3b478830-010a-408c-9eb4-0eaa51f75c31\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.166006 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g55wr\" (UniqueName: \"kubernetes.io/projected/5c85c0be-894b-4469-820c-35cac2b32905-kube-api-access-g55wr\") pod \"infra-operator-controller-manager-658588b8c9-bdx4w\" (UID: \"5c85c0be-894b-4469-820c-35cac2b32905\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.166029 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c85c0be-894b-4469-820c-35cac2b32905-cert\") pod \"infra-operator-controller-manager-658588b8c9-bdx4w\" (UID: \"5c85c0be-894b-4469-820c-35cac2b32905\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 
12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.166057 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmnc\" (UniqueName: \"kubernetes.io/projected/ed10a250-1c0b-4fc4-9906-6e01dba78a1e-kube-api-access-jmmnc\") pod \"heat-operator-controller-manager-54b4974c45-s8hw9\" (UID: \"ed10a250-1c0b-4fc4-9906-6e01dba78a1e\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.166613 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z"] Oct 06 12:23:19 crc kubenswrapper[4892]: E1006 12:23:19.166809 4892 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 06 12:23:19 crc kubenswrapper[4892]: E1006 12:23:19.166853 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c85c0be-894b-4469-820c-35cac2b32905-cert podName:5c85c0be-894b-4469-820c-35cac2b32905 nodeName:}" failed. No retries permitted until 2025-10-06 12:23:19.666836671 +0000 UTC m=+886.216542436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5c85c0be-894b-4469-820c-35cac2b32905-cert") pod "infra-operator-controller-manager-658588b8c9-bdx4w" (UID: "5c85c0be-894b-4469-820c-35cac2b32905") : secret "infra-operator-webhook-server-cert" not found Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.177765 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.180682 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.200005 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.202403 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfjcc\" (UniqueName: \"kubernetes.io/projected/773fe79a-1318-46ee-87bc-99786396705c-kube-api-access-wfjcc\") pod \"glance-operator-controller-manager-5dc44df7d5-gtrv6\" (UID: \"773fe79a-1318-46ee-87bc-99786396705c\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.203750 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g55wr\" (UniqueName: \"kubernetes.io/projected/5c85c0be-894b-4469-820c-35cac2b32905-kube-api-access-g55wr\") pod \"infra-operator-controller-manager-658588b8c9-bdx4w\" (UID: \"5c85c0be-894b-4469-820c-35cac2b32905\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.206605 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmnc\" (UniqueName: \"kubernetes.io/projected/ed10a250-1c0b-4fc4-9906-6e01dba78a1e-kube-api-access-jmmnc\") pod \"heat-operator-controller-manager-54b4974c45-s8hw9\" (UID: \"ed10a250-1c0b-4fc4-9906-6e01dba78a1e\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.219940 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.220894 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppvz\" (UniqueName: \"kubernetes.io/projected/3b478830-010a-408c-9eb4-0eaa51f75c31-kube-api-access-5ppvz\") pod \"horizon-operator-controller-manager-76d5b87f47-qdxqw\" (UID: \"3b478830-010a-408c-9eb4-0eaa51f75c31\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.222875 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.224176 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.225693 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rkdwh" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.248589 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.251760 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.253196 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.255306 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-thfx5" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.272604 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9fvq\" (UniqueName: \"kubernetes.io/projected/d23bd511-3627-436d-bea3-abc434a7ecc7-kube-api-access-s9fvq\") pod \"neutron-operator-controller-manager-8d984cc4d-ls78t\" (UID: \"d23bd511-3627-436d-bea3-abc434a7ecc7\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.272743 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxtml\" (UniqueName: \"kubernetes.io/projected/7652cece-84f1-49ba-b99b-8e40047e7822-kube-api-access-kxtml\") pod \"manila-operator-controller-manager-65d89cfd9f-cfcl2\" (UID: \"7652cece-84f1-49ba-b99b-8e40047e7822\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.272776 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cwk4\" (UniqueName: \"kubernetes.io/projected/8031228c-d653-49ea-aa71-90709d299152-kube-api-access-4cwk4\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-s8527\" (UID: \"8031228c-d653-49ea-aa71-90709d299152\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.272812 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9hv\" (UniqueName: \"kubernetes.io/projected/8bba42dc-f728-4066-a83c-632c7dfd4502-kube-api-access-wg9hv\") pod \"ironic-operator-controller-manager-649675d675-h6gp4\" (UID: \"8bba42dc-f728-4066-a83c-632c7dfd4502\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.272852 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hncx2\" (UniqueName: \"kubernetes.io/projected/23c2345a-b95d-40e2-9d36-4affd79498e8-kube-api-access-hncx2\") pod \"nova-operator-controller-manager-7c7fc454ff-6ws6z\" (UID: \"23c2345a-b95d-40e2-9d36-4affd79498e8\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.272910 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mflw8\" (UniqueName: \"kubernetes.io/projected/c71106dc-615a-4668-8485-8140171bd46e-kube-api-access-mflw8\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d\" (UID: \"c71106dc-615a-4668-8485-8140171bd46e\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.272971 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf2m6\" (UniqueName: \"kubernetes.io/projected/7e7d79cb-0957-4ef5-9b20-f82fd5d288d8-kube-api-access-gf2m6\") pod \"octavia-operator-controller-manager-7468f855d8-lvcfv\" (UID: \"7e7d79cb-0957-4ef5-9b20-f82fd5d288d8\") " 
pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.273656 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.277578 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.279822 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.288932 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cwk4\" (UniqueName: \"kubernetes.io/projected/8031228c-d653-49ea-aa71-90709d299152-kube-api-access-4cwk4\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-s8527\" (UID: \"8031228c-d653-49ea-aa71-90709d299152\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.296202 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxtml\" (UniqueName: \"kubernetes.io/projected/7652cece-84f1-49ba-b99b-8e40047e7822-kube-api-access-kxtml\") pod \"manila-operator-controller-manager-65d89cfd9f-cfcl2\" (UID: \"7652cece-84f1-49ba-b99b-8e40047e7822\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.307559 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.308038 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9hv\" (UniqueName: \"kubernetes.io/projected/8bba42dc-f728-4066-a83c-632c7dfd4502-kube-api-access-wg9hv\") pod \"ironic-operator-controller-manager-649675d675-h6gp4\" (UID: \"8bba42dc-f728-4066-a83c-632c7dfd4502\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.308743 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.310686 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nkpkh" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.317600 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.323211 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.323677 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.342271 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.350712 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.352152 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-dmbf9" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.352504 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.372926 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.373766 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcxgz\" (UniqueName: \"kubernetes.io/projected/1988af8e-3dcc-41b2-a044-03af6e6bc040-kube-api-access-dcxgz\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.373824 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hncx2\" (UniqueName: \"kubernetes.io/projected/23c2345a-b95d-40e2-9d36-4affd79498e8-kube-api-access-hncx2\") pod \"nova-operator-controller-manager-7c7fc454ff-6ws6z\" (UID: \"23c2345a-b95d-40e2-9d36-4affd79498e8\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.373851 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.373869 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvxq\" (UniqueName: \"kubernetes.io/projected/94df4eb7-c559-4ce8-9deb-5da3a04bebb7-kube-api-access-cbvxq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-7jqz4\" (UID: \"94df4eb7-c559-4ce8-9deb-5da3a04bebb7\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.373902 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mflw8\" (UniqueName: \"kubernetes.io/projected/c71106dc-615a-4668-8485-8140171bd46e-kube-api-access-mflw8\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d\" (UID: \"c71106dc-615a-4668-8485-8140171bd46e\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.373940 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf2m6\" (UniqueName: \"kubernetes.io/projected/7e7d79cb-0957-4ef5-9b20-f82fd5d288d8-kube-api-access-gf2m6\") pod \"octavia-operator-controller-manager-7468f855d8-lvcfv\" (UID: \"7e7d79cb-0957-4ef5-9b20-f82fd5d288d8\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.373986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcwjf\" (UniqueName: \"kubernetes.io/projected/32f9f85f-cb86-41f3-88c1-d891f5e67608-kube-api-access-vcwjf\") pod \"placement-operator-controller-manager-54689d9f88-q7kbv\" (UID: \"32f9f85f-cb86-41f3-88c1-d891f5e67608\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.374009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9fvq\" (UniqueName: \"kubernetes.io/projected/d23bd511-3627-436d-bea3-abc434a7ecc7-kube-api-access-s9fvq\") pod \"neutron-operator-controller-manager-8d984cc4d-ls78t\" (UID: \"d23bd511-3627-436d-bea3-abc434a7ecc7\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.394341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf2m6\" (UniqueName: \"kubernetes.io/projected/7e7d79cb-0957-4ef5-9b20-f82fd5d288d8-kube-api-access-gf2m6\") pod \"octavia-operator-controller-manager-7468f855d8-lvcfv\" (UID: \"7e7d79cb-0957-4ef5-9b20-f82fd5d288d8\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.396041 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.396530 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mflw8\" (UniqueName: \"kubernetes.io/projected/c71106dc-615a-4668-8485-8140171bd46e-kube-api-access-mflw8\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d\" (UID: \"c71106dc-615a-4668-8485-8140171bd46e\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.397151 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.398708 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9fvq\" (UniqueName: \"kubernetes.io/projected/d23bd511-3627-436d-bea3-abc434a7ecc7-kube-api-access-s9fvq\") pod \"neutron-operator-controller-manager-8d984cc4d-ls78t\" (UID: \"d23bd511-3627-436d-bea3-abc434a7ecc7\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.401305 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-zjx8v" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.405747 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hncx2\" (UniqueName: \"kubernetes.io/projected/23c2345a-b95d-40e2-9d36-4affd79498e8-kube-api-access-hncx2\") pod \"nova-operator-controller-manager-7c7fc454ff-6ws6z\" (UID: \"23c2345a-b95d-40e2-9d36-4affd79498e8\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.414892 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.439036 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.460798 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.469491 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.471551 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.474863 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9vk8\" (UniqueName: \"kubernetes.io/projected/6a4055ea-2ed1-4de3-a975-850404b8d746-kube-api-access-z9vk8\") pod \"swift-operator-controller-manager-6859f9b676-5gfhv\" (UID: \"6a4055ea-2ed1-4de3-a975-850404b8d746\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.474907 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcxgz\" (UniqueName: \"kubernetes.io/projected/1988af8e-3dcc-41b2-a044-03af6e6bc040-kube-api-access-dcxgz\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.474927 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tct66\" (UniqueName: \"kubernetes.io/projected/304bfead-26e9-4a95-9c1d-8659de9b0546-kube-api-access-tct66\") pod \"telemetry-operator-controller-manager-5d4d74dd89-wht6n\" (UID: \"304bfead-26e9-4a95-9c1d-8659de9b0546\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.474975 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.474991 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvxq\" (UniqueName: \"kubernetes.io/projected/94df4eb7-c559-4ce8-9deb-5da3a04bebb7-kube-api-access-cbvxq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-7jqz4\" (UID: \"94df4eb7-c559-4ce8-9deb-5da3a04bebb7\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.475047 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcwjf\" (UniqueName: \"kubernetes.io/projected/32f9f85f-cb86-41f3-88c1-d891f5e67608-kube-api-access-vcwjf\") pod \"placement-operator-controller-manager-54689d9f88-q7kbv\" (UID: \"32f9f85f-cb86-41f3-88c1-d891f5e67608\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" Oct 06 12:23:19 crc kubenswrapper[4892]: E1006 12:23:19.475571 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 12:23:19 crc kubenswrapper[4892]: E1006 12:23:19.477198 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert podName:1988af8e-3dcc-41b2-a044-03af6e6bc040 nodeName:}" failed. No retries permitted until 2025-10-06 12:23:19.97559431 +0000 UTC m=+886.525300075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" (UID: "1988af8e-3dcc-41b2-a044-03af6e6bc040") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.479365 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.480512 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-l25mp" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.502685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvxq\" (UniqueName: \"kubernetes.io/projected/94df4eb7-c559-4ce8-9deb-5da3a04bebb7-kube-api-access-cbvxq\") pod \"ovn-operator-controller-manager-6d8b6f9b9-7jqz4\" (UID: \"94df4eb7-c559-4ce8-9deb-5da3a04bebb7\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.506883 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcwjf\" (UniqueName: \"kubernetes.io/projected/32f9f85f-cb86-41f3-88c1-d891f5e67608-kube-api-access-vcwjf\") pod \"placement-operator-controller-manager-54689d9f88-q7kbv\" (UID: \"32f9f85f-cb86-41f3-88c1-d891f5e67608\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.509853 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcxgz\" (UniqueName: \"kubernetes.io/projected/1988af8e-3dcc-41b2-a044-03af6e6bc040-kube-api-access-dcxgz\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.531406 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.532843 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.544659 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-7gjcs" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.576155 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9vk8\" (UniqueName: \"kubernetes.io/projected/6a4055ea-2ed1-4de3-a975-850404b8d746-kube-api-access-z9vk8\") pod \"swift-operator-controller-manager-6859f9b676-5gfhv\" (UID: \"6a4055ea-2ed1-4de3-a975-850404b8d746\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.576292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tct66\" (UniqueName: \"kubernetes.io/projected/304bfead-26e9-4a95-9c1d-8659de9b0546-kube-api-access-tct66\") pod \"telemetry-operator-controller-manager-5d4d74dd89-wht6n\" (UID: \"304bfead-26e9-4a95-9c1d-8659de9b0546\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.576399 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8h9w\" (UniqueName: \"kubernetes.io/projected/e81e38e4-cf8a-4a93-8753-335c6c1aca6c-kube-api-access-c8h9w\") pod \"test-operator-controller-manager-5cd5cb47d7-z6qpz\" (UID: \"e81e38e4-cf8a-4a93-8753-335c6c1aca6c\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.578191 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.582990 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.597315 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9vk8\" (UniqueName: \"kubernetes.io/projected/6a4055ea-2ed1-4de3-a975-850404b8d746-kube-api-access-z9vk8\") pod \"swift-operator-controller-manager-6859f9b676-5gfhv\" (UID: \"6a4055ea-2ed1-4de3-a975-850404b8d746\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.597694 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.608676 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.611387 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tct66\" (UniqueName: \"kubernetes.io/projected/304bfead-26e9-4a95-9c1d-8659de9b0546-kube-api-access-tct66\") pod \"telemetry-operator-controller-manager-5d4d74dd89-wht6n\" (UID: \"304bfead-26e9-4a95-9c1d-8659de9b0546\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.622486 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.623808 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.635926 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-b77k5" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.636132 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.637859 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.638228 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.652016 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.669547 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.670886 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.673925 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.676076 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ktjmk" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.676883 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.678083 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.706082 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.707186 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8h9w\" (UniqueName: \"kubernetes.io/projected/e81e38e4-cf8a-4a93-8753-335c6c1aca6c-kube-api-access-c8h9w\") pod \"test-operator-controller-manager-5cd5cb47d7-z6qpz\" (UID: \"e81e38e4-cf8a-4a93-8753-335c6c1aca6c\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.708868 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nkzs\" (UniqueName: \"kubernetes.io/projected/41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a-kube-api-access-2nkzs\") pod \"watcher-operator-controller-manager-5966748665-mwjwq\" (UID: \"41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a\") " pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.708974 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c85c0be-894b-4469-820c-35cac2b32905-cert\") pod \"infra-operator-controller-manager-658588b8c9-bdx4w\" (UID: \"5c85c0be-894b-4469-820c-35cac2b32905\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.712143 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5c85c0be-894b-4469-820c-35cac2b32905-cert\") pod \"infra-operator-controller-manager-658588b8c9-bdx4w\" (UID: \"5c85c0be-894b-4469-820c-35cac2b32905\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.714673 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.748753 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.779352 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8h9w\" (UniqueName: \"kubernetes.io/projected/e81e38e4-cf8a-4a93-8753-335c6c1aca6c-kube-api-access-c8h9w\") pod \"test-operator-controller-manager-5cd5cb47d7-z6qpz\" (UID: \"e81e38e4-cf8a-4a93-8753-335c6c1aca6c\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.783113 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.790598 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.800266 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9"] Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.810738 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd70d284-2387-44ef-ad9c-eb725a2a283d-cert\") pod \"openstack-operator-controller-manager-5c6b9976b-dv6c2\" (UID: \"dd70d284-2387-44ef-ad9c-eb725a2a283d\") " pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.810801 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vm9f\" (UniqueName: \"kubernetes.io/projected/2f1e9464-e868-4fb8-baee-7832454f4cd5-kube-api-access-8vm9f\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk\" (UID: \"2f1e9464-e868-4fb8-baee-7832454f4cd5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.810846 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctnpb\" (UniqueName: \"kubernetes.io/projected/dd70d284-2387-44ef-ad9c-eb725a2a283d-kube-api-access-ctnpb\") pod \"openstack-operator-controller-manager-5c6b9976b-dv6c2\" (UID: \"dd70d284-2387-44ef-ad9c-eb725a2a283d\") " pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.810870 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nkzs\" (UniqueName: \"kubernetes.io/projected/41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a-kube-api-access-2nkzs\") pod \"watcher-operator-controller-manager-5966748665-mwjwq\" (UID: \"41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a\") " pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" Oct 06 12:23:19 crc kubenswrapper[4892]: W1006 12:23:19.821350 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod718cc1ac_8554_450f_bca5_5449909339dd.slice/crio-79de50190c4e4b748aed9c8b593818370c87e18c52b08d225c930d963bb899d7 WatchSource:0}: Error finding container 79de50190c4e4b748aed9c8b593818370c87e18c52b08d225c930d963bb899d7: Status 404 returned error can't find the container with id 79de50190c4e4b748aed9c8b593818370c87e18c52b08d225c930d963bb899d7 Oct 06 
12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.826535 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nkzs\" (UniqueName: \"kubernetes.io/projected/41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a-kube-api-access-2nkzs\") pod \"watcher-operator-controller-manager-5966748665-mwjwq\" (UID: \"41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a\") " pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.831334 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.887807 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.911948 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd70d284-2387-44ef-ad9c-eb725a2a283d-cert\") pod \"openstack-operator-controller-manager-5c6b9976b-dv6c2\" (UID: \"dd70d284-2387-44ef-ad9c-eb725a2a283d\") " pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.912000 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vm9f\" (UniqueName: \"kubernetes.io/projected/2f1e9464-e868-4fb8-baee-7832454f4cd5-kube-api-access-8vm9f\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk\" (UID: \"2f1e9464-e868-4fb8-baee-7832454f4cd5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.912040 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctnpb\" (UniqueName: \"kubernetes.io/projected/dd70d284-2387-44ef-ad9c-eb725a2a283d-kube-api-access-ctnpb\") pod \"openstack-operator-controller-manager-5c6b9976b-dv6c2\" (UID: \"dd70d284-2387-44ef-ad9c-eb725a2a283d\") " pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.913662 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.917224 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dd70d284-2387-44ef-ad9c-eb725a2a283d-cert\") pod \"openstack-operator-controller-manager-5c6b9976b-dv6c2\" (UID: \"dd70d284-2387-44ef-ad9c-eb725a2a283d\") " pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.929470 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctnpb\" (UniqueName: \"kubernetes.io/projected/dd70d284-2387-44ef-ad9c-eb725a2a283d-kube-api-access-ctnpb\") pod \"openstack-operator-controller-manager-5c6b9976b-dv6c2\" (UID: \"dd70d284-2387-44ef-ad9c-eb725a2a283d\") " pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.929841 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vm9f\" (UniqueName: \"kubernetes.io/projected/2f1e9464-e868-4fb8-baee-7832454f4cd5-kube-api-access-8vm9f\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk\" (UID: \"2f1e9464-e868-4fb8-baee-7832454f4cd5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" Oct 06 12:23:19 crc kubenswrapper[4892]: I1006 12:23:19.972870 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.013043 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.013240 4892 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.013287 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert podName:1988af8e-3dcc-41b2-a044-03af6e6bc040 nodeName:}" failed. No retries permitted until 2025-10-06 12:23:21.01327291 +0000 UTC m=+887.562978675 (durationBeforeRetry 1s). 
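Editor's note: the two MountVolume "cert" failures above (the Error detail for the 1s retry follows immediately below) are the same startup race. The infra-operator and openstack-baremetal-operator pods mount their webhook serving certificate from Secrets ("infra-operator-webhook-server-cert", "openstack-baremetal-operator-webhook-server-cert") that have not been created yet when the kubelet first syncs the pods. Each failure is rescheduled with a doubling delay, 500ms for the first attempt and 1s for the next, and the race resolves quickly here: the infra-operator cert mount succeeds at 12:23:19.712143, one retry later. A minimal Go sketch of that doubling schedule, under assumed names and an assumed cap (illustrative only, not kubelet's implementation):

```go
package main

import (
	"fmt"
	"time"
)

// expBackoff tracks the delay before the next retry of a single failed
// volume operation, mirroring the durationBeforeRetry values in the log.
type expBackoff struct {
	delay time.Duration // current durationBeforeRetry
	max   time.Duration // assumed cap on the delay
}

// next returns the delay for the upcoming retry and doubles the stored
// delay for the attempt after that, saturating at the cap.
func (b *expBackoff) next() time.Duration {
	d := b.delay
	b.delay *= 2
	if b.delay > b.max {
		b.delay = b.max
	}
	return d
}

func main() {
	// Matches the excerpt: first retry after 500ms, second after 1s, ...
	b := expBackoff{delay: 500 * time.Millisecond, max: 2 * time.Minute}
	for i := 1; i <= 4; i++ {
		fmt.Printf("retry %d scheduled after %v\n", i, b.next())
	}
}
```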
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" (UID: "1988af8e-3dcc-41b2-a044-03af6e6bc040") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.050889 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527"] Oct 06 12:23:20 crc kubenswrapper[4892]: W1006 12:23:20.077353 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8031228c_d653_49ea_aa71_90709d299152.slice/crio-a76d4aaad00ef5a19114dfae91425b97a3a7def9a44a7d2021809db1b694472b WatchSource:0}: Error finding container a76d4aaad00ef5a19114dfae91425b97a3a7def9a44a7d2021809db1b694472b: Status 404 returned error can't find the container with id a76d4aaad00ef5a19114dfae91425b97a3a7def9a44a7d2021809db1b694472b Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.124497 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.243702 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw"] Oct 06 12:23:20 crc kubenswrapper[4892]: W1006 12:23:20.249272 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b478830_010a_408c_9eb4_0eaa51f75c31.slice/crio-1f4b3b998d9a520941e6733d883eb19ec91407618751fa48e6691567f24a1ce5 WatchSource:0}: Error finding container 1f4b3b998d9a520941e6733d883eb19ec91407618751fa48e6691567f24a1ce5: Status 404 returned error can't find the container with id 1f4b3b998d9a520941e6733d883eb19ec91407618751fa48e6691567f24a1ce5 Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.315338 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.323199 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.325155 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.492408 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.501541 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.533395 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.544126 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.574100 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4"] Oct 06 12:23:20 crc kubenswrapper[4892]: W1006 12:23:20.637526 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94df4eb7_c559_4ce8_9deb_5da3a04bebb7.slice/crio-2f05fbea92e73b20c3ee46663127fc14c2cdbaf1aeb91b1678a0a5a5dfa65772 WatchSource:0}: Error finding container 2f05fbea92e73b20c3ee46663127fc14c2cdbaf1aeb91b1678a0a5a5dfa65772: Status 404 returned error can't find the container with id 2f05fbea92e73b20c3ee46663127fc14c2cdbaf1aeb91b1678a0a5a5dfa65772 Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.646879 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.658565 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w"] Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.663352 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cbvxq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-6d8b6f9b9-7jqz4_openstack-operators(94df4eb7-c559-4ce8-9deb-5da3a04bebb7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 
12:23:20.680907 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.687913 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" event={"ID":"94df4eb7-c559-4ce8-9deb-5da3a04bebb7","Type":"ContainerStarted","Data":"2f05fbea92e73b20c3ee46663127fc14c2cdbaf1aeb91b1678a0a5a5dfa65772"} Oct 06 12:23:20 crc kubenswrapper[4892]: W1006 12:23:20.693002 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a4055ea_2ed1_4de3_a975_850404b8d746.slice/crio-cdc2f3c71d66f79b90229836db68b565c9b3f731a10f64d9f1fc878b50e60ba6 WatchSource:0}: Error finding container cdc2f3c71d66f79b90229836db68b565c9b3f731a10f64d9f1fc878b50e60ba6: Status 404 returned error can't find the container with id cdc2f3c71d66f79b90229836db68b565c9b3f731a10f64d9f1fc878b50e60ba6 Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.693223 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" event={"ID":"8bba42dc-f728-4066-a83c-632c7dfd4502","Type":"ContainerStarted","Data":"dd896b143097aab1055cb8e966f675cf28627899d45682da18875a9941306df8"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.695973 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv"] Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.700090 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z9vk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-5gfhv_openstack-operators(6a4055ea-2ed1-4de3-a975-850404b8d746): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.705526 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g55wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
infra-operator-controller-manager-658588b8c9-bdx4w_openstack-operators(5c85c0be-894b-4469-820c-35cac2b32905): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.706847 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" event={"ID":"e81e38e4-cf8a-4a93-8753-335c6c1aca6c","Type":"ContainerStarted","Data":"72d3007a632ab88f60e249bb5817c8b28862080f538de2849b23f5ad7e6b9880"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.710819 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" event={"ID":"ed10a250-1c0b-4fc4-9906-6e01dba78a1e","Type":"ContainerStarted","Data":"ec08263bb2d4c3e95dc117b333c508bcfbb2645d32aa81d81ab2ae732440a111"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.713353 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" event={"ID":"3b478830-010a-408c-9eb4-0eaa51f75c31","Type":"ContainerStarted","Data":"1f4b3b998d9a520941e6733d883eb19ec91407618751fa48e6691567f24a1ce5"} Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.731304 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gf2m6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-7468f855d8-lvcfv_openstack-operators(7e7d79cb-0957-4ef5-9b20-f82fd5d288d8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.735341 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" event={"ID":"c71106dc-615a-4668-8485-8140171bd46e","Type":"ContainerStarted","Data":"f7ad35896ce6ff12aa285ba29f557f3027a4acc6c39e82e70652f955c63dcead"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.737406 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" event={"ID":"773fe79a-1318-46ee-87bc-99786396705c","Type":"ContainerStarted","Data":"3ec333a1a617343f38a74deb7b9a1d219b21632dcb46b670fb458c2208e2c4f6"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.741965 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" event={"ID":"32f9f85f-cb86-41f3-88c1-d891f5e67608","Type":"ContainerStarted","Data":"8da04be0c9ec3ba3be1b25dfd8504a65586f1457df9d3a32177f869400edb887"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.742914 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" event={"ID":"7652cece-84f1-49ba-b99b-8e40047e7822","Type":"ContainerStarted","Data":"e4d7517cfb7b06ea08d2491e8387c4a9557decbb0f5873a52ebef377b87c6ed8"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.743939 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" event={"ID":"d23bd511-3627-436d-bea3-abc434a7ecc7","Type":"ContainerStarted","Data":"eb7bcfe505a7339382df9a6db93c06e8b0d1247b3ffcb337ba3a4d2d33748edf"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.744817 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" event={"ID":"8031228c-d653-49ea-aa71-90709d299152","Type":"ContainerStarted","Data":"a76d4aaad00ef5a19114dfae91425b97a3a7def9a44a7d2021809db1b694472b"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.749175 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" event={"ID":"4c2bced1-04b0-4525-b1d3-c3adc3669b68","Type":"ContainerStarted","Data":"e6f7cbc973ef2f106d228585f2e4c17e44484a5bf3a4ad4bb31fa06c05fe096c"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.753067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" event={"ID":"718cc1ac-8554-450f-bca5-5449909339dd","Type":"ContainerStarted","Data":"79de50190c4e4b748aed9c8b593818370c87e18c52b08d225c930d963bb899d7"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.755274 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" event={"ID":"23c2345a-b95d-40e2-9d36-4affd79498e8","Type":"ContainerStarted","Data":"90bc00251ae48e0b9eaae65b47a5c3481ada9eb0d72a2f936d9ef6f09380ddd7"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.756160 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" 
event={"ID":"1de023aa-81a7-4510-a6a3-93010ca572be","Type":"ContainerStarted","Data":"bc35d0c5a967f63f2263fb80803937a5e37a0fbd3e8c80a93f1da6b3e7d30947"} Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.761436 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n"] Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.765929 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2"] Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.786076 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tct66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-wht6n_openstack-operators(304bfead-26e9-4a95-9c1d-8659de9b0546): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 06 12:23:20 crc kubenswrapper[4892]: I1006 12:23:20.879707 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk"] Oct 06 12:23:20 crc kubenswrapper[4892]: W1006 12:23:20.892201 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1e9464_e868_4fb8_baee_7832454f4cd5.slice/crio-f42e0ad82fad8b8b572c351b40f22931ba7680fbdfe91c59f3217b120b8eb847 
WatchSource:0}: Error finding container f42e0ad82fad8b8b572c351b40f22931ba7680fbdfe91c59f3217b120b8eb847: Status 404 returned error can't find the container with id f42e0ad82fad8b8b572c351b40f22931ba7680fbdfe91c59f3217b120b8eb847 Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.917208 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" podUID="6a4055ea-2ed1-4de3-a975-850404b8d746" Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.946572 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" podUID="94df4eb7-c559-4ce8-9deb-5da3a04bebb7" Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.948498 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" podUID="5c85c0be-894b-4469-820c-35cac2b32905" Oct 06 12:23:20 crc kubenswrapper[4892]: E1006 12:23:20.967641 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" podUID="7e7d79cb-0957-4ef5-9b20-f82fd5d288d8" Oct 06 12:23:21 crc kubenswrapper[4892]: E1006 12:23:21.048792 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" podUID="304bfead-26e9-4a95-9c1d-8659de9b0546" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.049400 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.058061 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1988af8e-3dcc-41b2-a044-03af6e6bc040-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k\" (UID: \"1988af8e-3dcc-41b2-a044-03af6e6bc040\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.128726 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.485056 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k"] Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.790873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" event={"ID":"94df4eb7-c559-4ce8-9deb-5da3a04bebb7","Type":"ContainerStarted","Data":"92055d4c99124defbe6004f2e3569da98a9dd0a904212023218c9596fb67cc43"} Oct 06 12:23:21 crc kubenswrapper[4892]: E1006 12:23:21.791939 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" podUID="94df4eb7-c559-4ce8-9deb-5da3a04bebb7" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.799340 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" event={"ID":"5c85c0be-894b-4469-820c-35cac2b32905","Type":"ContainerStarted","Data":"ae035e4900928243813513b9a7214209fe1468cd37feb2a63af882155bc2e245"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.799379 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" event={"ID":"5c85c0be-894b-4469-820c-35cac2b32905","Type":"ContainerStarted","Data":"01084c535ccc52a9233f4397eba2e42a2b8f0c5f362cf631a0a6890ed79b0a21"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.803341 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" event={"ID":"304bfead-26e9-4a95-9c1d-8659de9b0546","Type":"ContainerStarted","Data":"43f3b248c46dd7fd672f8cc683797c4340e46a70fb6f18956f434fce748e11f7"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.803456 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" event={"ID":"304bfead-26e9-4a95-9c1d-8659de9b0546","Type":"ContainerStarted","Data":"21dc7191d81984cf932a6e6c24b28868de108726bdab59a400788ce0b8033550"} Oct 06 12:23:21 crc kubenswrapper[4892]: E1006 12:23:21.804956 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" podUID="5c85c0be-894b-4469-820c-35cac2b32905" Oct 06 12:23:21 crc kubenswrapper[4892]: E1006 12:23:21.805287 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" podUID="304bfead-26e9-4a95-9c1d-8659de9b0546" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.817906 4892 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" event={"ID":"2f1e9464-e868-4fb8-baee-7832454f4cd5","Type":"ContainerStarted","Data":"f42e0ad82fad8b8b572c351b40f22931ba7680fbdfe91c59f3217b120b8eb847"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.836920 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" event={"ID":"dd70d284-2387-44ef-ad9c-eb725a2a283d","Type":"ContainerStarted","Data":"85fa693601113787b684b0715d5b7cd13432a419be166036b58214e29b4a16a0"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.836978 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" event={"ID":"dd70d284-2387-44ef-ad9c-eb725a2a283d","Type":"ContainerStarted","Data":"f54110017ca161f88968d88bcbc27085a28016c2b498caa5a318eaf3d13e6771"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.836988 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" event={"ID":"dd70d284-2387-44ef-ad9c-eb725a2a283d","Type":"ContainerStarted","Data":"8e89e11767facf4ca71935a4ff74303f639f511732e30571ec809cf7c55e9e6f"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.837178 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.844478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" event={"ID":"6a4055ea-2ed1-4de3-a975-850404b8d746","Type":"ContainerStarted","Data":"7e372a15413face12e402589b6cd29038a44da874e1ac01490e651bb673d2aef"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.844516 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" event={"ID":"6a4055ea-2ed1-4de3-a975-850404b8d746","Type":"ContainerStarted","Data":"cdc2f3c71d66f79b90229836db68b565c9b3f731a10f64d9f1fc878b50e60ba6"} Oct 06 12:23:21 crc kubenswrapper[4892]: E1006 12:23:21.845838 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" podUID="6a4055ea-2ed1-4de3-a975-850404b8d746" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.848051 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" event={"ID":"7e7d79cb-0957-4ef5-9b20-f82fd5d288d8","Type":"ContainerStarted","Data":"1bd4838aa3b859adc55af3c2f25f2fa7e6cd60b4817beb8d00659ab299b0bbfa"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.848069 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" event={"ID":"7e7d79cb-0957-4ef5-9b20-f82fd5d288d8","Type":"ContainerStarted","Data":"5171efc78f9dc606cdf9cfd1c083a00c7b9d269981dbc5c89c1b90314f66e1de"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.851009 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" event={"ID":"41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a","Type":"ContainerStarted","Data":"faf8ffd3a4752d12e28e4ef1ef94ad1a440cc609c38ff602d68d1d64c8205e44"} Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.852695 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" event={"ID":"1988af8e-3dcc-41b2-a044-03af6e6bc040","Type":"ContainerStarted","Data":"7a52a4997fdb06418fdb95ce971366a2efaf3c91b31c95275db2a59f8caabfdc"} Oct 06 12:23:21 crc kubenswrapper[4892]: E1006 12:23:21.853536 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" podUID="7e7d79cb-0957-4ef5-9b20-f82fd5d288d8" Oct 06 12:23:21 crc kubenswrapper[4892]: I1006 12:23:21.862249 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" podStartSLOduration=2.862237062 podStartE2EDuration="2.862237062s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:23:21.861051249 +0000 UTC m=+888.410757014" watchObservedRunningTime="2025-10-06 12:23:21.862237062 +0000 UTC m=+888.411942827" Oct 06 12:23:22 crc kubenswrapper[4892]: E1006 12:23:22.863120 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" podUID="304bfead-26e9-4a95-9c1d-8659de9b0546" Oct 06 12:23:22 crc kubenswrapper[4892]: E1006 12:23:22.864204 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f37e29d1f621c23c0d77b09076006d1e8002a77c2ff3d9b8921f893221cb1d09\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" podUID="94df4eb7-c559-4ce8-9deb-5da3a04bebb7" Oct 06 12:23:22 crc kubenswrapper[4892]: E1006 12:23:22.864268 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:da5c3078d80878d66c616e6f8a0bb909f95d971cde2c612f96fded064113e182\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" podUID="7e7d79cb-0957-4ef5-9b20-f82fd5d288d8" Oct 06 12:23:22 crc kubenswrapper[4892]: E1006 12:23:22.864295 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" podUID="6a4055ea-2ed1-4de3-a975-850404b8d746" 
Oct 06 12:23:22 crc kubenswrapper[4892]: E1006 12:23:22.864399 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" podUID="5c85c0be-894b-4469-820c-35cac2b32905" Oct 06 12:23:29 crc kubenswrapper[4892]: I1006 12:23:29.979945 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5c6b9976b-dv6c2" Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.938110 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" event={"ID":"8031228c-d653-49ea-aa71-90709d299152","Type":"ContainerStarted","Data":"5507e291addb5b288ae1a1e1e496165eba3d21a07465d422f05f3e52d3e23491"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.960136 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" event={"ID":"3b478830-010a-408c-9eb4-0eaa51f75c31","Type":"ContainerStarted","Data":"df8dc6aadf874da29101ca97484653eb305c2f2080a6387272ddaf36036abd0a"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.960182 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" event={"ID":"3b478830-010a-408c-9eb4-0eaa51f75c31","Type":"ContainerStarted","Data":"02414b19654283124693491d0321b82c9008c57a4b000a4812cfa31efdb10623"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.960223 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.967006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" event={"ID":"c71106dc-615a-4668-8485-8140171bd46e","Type":"ContainerStarted","Data":"e9dd9869f10a215146673d9e87a88073482c51cc1ef7a50092a16f564772f36b"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.971365 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" event={"ID":"718cc1ac-8554-450f-bca5-5449909339dd","Type":"ContainerStarted","Data":"c82fefb2d1cf64789a59b37b34a6c680fed50a92c7d1fabec3a55c41ca71c7ce"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.981227 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" event={"ID":"7652cece-84f1-49ba-b99b-8e40047e7822","Type":"ContainerStarted","Data":"acea1ddcafc82e8224e3b58d73432610d88e142b5af3f9dbac394cae3d5d9c42"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.981271 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" event={"ID":"7652cece-84f1-49ba-b99b-8e40047e7822","Type":"ContainerStarted","Data":"bf49ad9889be5d50f8cdb2bf3f6e726c13ec6ed905d39207e201621b03ebc21d"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.981331 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" 
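[Editor's note] Each "SyncLoop (probe)" entry is the kubelet reporting a readiness-probe result transition for one of the manager containers: first an empty status when the container starts, then status="ready" once /readyz begins answering (the 12:23:39 block below). The probe wiring these events exercise is visible in the &Container{...} specs dumped with the earlier Unhandled Error events; rendered back into manifest form, that logged configuration reads as follows (a reconstruction from the log fields, not the operators' source):

    # Probe block reconstructed from the fields logged at 12:23:20
    # (InitialDelaySeconds, PeriodSeconds, etc. in the Container dumps).
    livenessProbe:
      httpGet:
        path: /healthz
        port: 8081
      initialDelaySeconds: 15
      periodSeconds: 20
      timeoutSeconds: 1
      successThreshold: 1
      failureThreshold: 3
    readinessProbe:
      httpGet:
        path: /readyz
        port: 8081
      initialDelaySeconds: 5
      periodSeconds: 10
      timeoutSeconds: 1
      successThreshold: 1
      failureThreshold: 3

With a 5-second initial delay and 10-second period on the readiness probe, the 6-8 second gap between the ContainerStarted events at 12:23:31-33 and the "ready" transitions at 12:23:39 is exactly what this configuration predicts.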
Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.982754 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" podStartSLOduration=3.811797835 podStartE2EDuration="13.982743924s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.255863404 +0000 UTC m=+886.805569169" lastFinishedPulling="2025-10-06 12:23:30.426809483 +0000 UTC m=+896.976515258" observedRunningTime="2025-10-06 12:23:31.980739018 +0000 UTC m=+898.530444793" watchObservedRunningTime="2025-10-06 12:23:31.982743924 +0000 UTC m=+898.532449689" Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.983055 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" event={"ID":"2f1e9464-e868-4fb8-baee-7832454f4cd5","Type":"ContainerStarted","Data":"ea1829a0c307a3ff7265c87d0497c52e317cda7bf68fae1b1aac7daf0c6dcc44"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.994773 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" event={"ID":"23c2345a-b95d-40e2-9d36-4affd79498e8","Type":"ContainerStarted","Data":"ebae4bee1ce459f7ba16d2a9bf2ac7acccd4c24764dfb4e7d4b871daec02ca25"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.996528 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" event={"ID":"41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a","Type":"ContainerStarted","Data":"0e2b39cd5bd9b4204d4ec1ccd157d1f84170b30200601470fc4553bf552ea82e"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.997768 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" event={"ID":"4c2bced1-04b0-4525-b1d3-c3adc3669b68","Type":"ContainerStarted","Data":"f653b059f4ef6b59a72a63bcfa3a36a2a6276679f1e8ecdd674ddfdb54e5ad60"} Oct 06 12:23:31 crc kubenswrapper[4892]: I1006 12:23:31.998342 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.003101 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" podStartSLOduration=3.851136701 podStartE2EDuration="14.003085461s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.331479222 +0000 UTC m=+886.881184987" lastFinishedPulling="2025-10-06 12:23:30.483427942 +0000 UTC m=+897.033133747" observedRunningTime="2025-10-06 12:23:31.999487291 +0000 UTC m=+898.549193046" watchObservedRunningTime="2025-10-06 12:23:32.003085461 +0000 UTC m=+898.552791226" Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.003670 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" event={"ID":"d23bd511-3627-436d-bea3-abc434a7ecc7","Type":"ContainerStarted","Data":"f9b996ec35d11846976e2f914b09b03504ded6623d8a779950600f9862fbb2ca"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.015345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" 
event={"ID":"1988af8e-3dcc-41b2-a044-03af6e6bc040","Type":"ContainerStarted","Data":"2333bd00e3fef361bf4d1f27f7dd3622254cc467c77e1cd7f3cdd5a86c954362"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.029131 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" event={"ID":"e81e38e4-cf8a-4a93-8753-335c6c1aca6c","Type":"ContainerStarted","Data":"8c361956fd98ef01ae5e2a821a3dabd40e8b054c2af9d6454bf6643f10a3e023"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.031341 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" podStartSLOduration=3.4095351689999998 podStartE2EDuration="14.031313118s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:19.873417281 +0000 UTC m=+886.423123046" lastFinishedPulling="2025-10-06 12:23:30.49519522 +0000 UTC m=+897.044900995" observedRunningTime="2025-10-06 12:23:32.029539479 +0000 UTC m=+898.579245244" watchObservedRunningTime="2025-10-06 12:23:32.031313118 +0000 UTC m=+898.581018883" Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.047775 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" event={"ID":"ed10a250-1c0b-4fc4-9906-6e01dba78a1e","Type":"ContainerStarted","Data":"48731ca3b6775f2adc6cf294a741888b63a11314c3eee9bf52b9a2920d4943e4"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.048007 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk" podStartSLOduration=3.45256232 podStartE2EDuration="13.047997363s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.89483564 +0000 UTC m=+887.444541405" lastFinishedPulling="2025-10-06 12:23:30.490270673 +0000 UTC m=+897.039976448" observedRunningTime="2025-10-06 12:23:32.043024175 +0000 UTC m=+898.592729930" watchObservedRunningTime="2025-10-06 12:23:32.047997363 +0000 UTC m=+898.597703128" Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.060313 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" event={"ID":"773fe79a-1318-46ee-87bc-99786396705c","Type":"ContainerStarted","Data":"cd2fd4ba190870b72f11e8057faa7b65363519feb48dfb7fb5777d12ae6fb0f3"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.071897 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" event={"ID":"8bba42dc-f728-4066-a83c-632c7dfd4502","Type":"ContainerStarted","Data":"4a79aa2476842de1e76795d2a496c7e32e66f1deef54a3b38aca0dc6b1d388ca"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.086345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" event={"ID":"32f9f85f-cb86-41f3-88c1-d891f5e67608","Type":"ContainerStarted","Data":"a0d6e02f3a2924a04529f1490cffb04ed1cadc8194a6718063d1854ed5abc240"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.100402 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" event={"ID":"1de023aa-81a7-4510-a6a3-93010ca572be","Type":"ContainerStarted","Data":"b36868d2e9f522da4138bbd8ecfb705fc283bb1cda0fccaa9f9e4f430b941ff2"} 
Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.100442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" event={"ID":"1de023aa-81a7-4510-a6a3-93010ca572be","Type":"ContainerStarted","Data":"5ced97e1ca3f3360dc007062c44d5512e0ca6983d5bbca5e3187e55504b182a6"} Oct 06 12:23:32 crc kubenswrapper[4892]: I1006 12:23:32.101213 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.109898 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" event={"ID":"773fe79a-1318-46ee-87bc-99786396705c","Type":"ContainerStarted","Data":"77ad96d577aa55cc949a8eb7b2505a1f4a2d6bba4e65e19cef35d8fc100eeb6e"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.111116 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.112132 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" event={"ID":"32f9f85f-cb86-41f3-88c1-d891f5e67608","Type":"ContainerStarted","Data":"be119def4dee5bd32bbd11744905c3570118465f17986a7c406481d58f3fdac0"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.112734 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.114517 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" event={"ID":"23c2345a-b95d-40e2-9d36-4affd79498e8","Type":"ContainerStarted","Data":"55b42c6e1035fb18995d6b1fe47495b364cfb309c99e6f1ee7baad1bc50effff"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.115302 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.116309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" event={"ID":"1988af8e-3dcc-41b2-a044-03af6e6bc040","Type":"ContainerStarted","Data":"e4507832540d248dc06a5837c73882cf64cbedf91184330dc20f521afa298c43"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.116499 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.118202 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" event={"ID":"ed10a250-1c0b-4fc4-9906-6e01dba78a1e","Type":"ContainerStarted","Data":"0643851a18848287309e0394fe1c4772527e44878a11bbf32da3239240e477bc"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.119192 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.123351 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" event={"ID":"4c2bced1-04b0-4525-b1d3-c3adc3669b68","Type":"ContainerStarted","Data":"fb97731e20a4eab0c2e2f9fdb67ca321a2401669e3f30f5011563e1063fe5b27"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.125158 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" event={"ID":"718cc1ac-8554-450f-bca5-5449909339dd","Type":"ContainerStarted","Data":"d6cd71775b811872c53d79975293dfc6c207e4641903d09b870b129003867972"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.125794 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.132135 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" event={"ID":"8bba42dc-f728-4066-a83c-632c7dfd4502","Type":"ContainerStarted","Data":"3a5d58e4ea8520fe34954f19316512605aaa073cb3f051d60578619aedae4053"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.132223 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.138190 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" event={"ID":"d23bd511-3627-436d-bea3-abc434a7ecc7","Type":"ContainerStarted","Data":"beff6725d72ea1dfee1e4810d4653276cda22760bfdc0713426c8d38f23c1077"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.139683 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.140588 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" podStartSLOduration=4.397937379 podStartE2EDuration="15.140553986s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:19.676627105 +0000 UTC m=+886.226332870" lastFinishedPulling="2025-10-06 12:23:30.419243672 +0000 UTC m=+896.968949477" observedRunningTime="2025-10-06 12:23:32.121673888 +0000 UTC m=+898.671379653" watchObservedRunningTime="2025-10-06 12:23:33.140553986 +0000 UTC m=+899.690259751" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.142426 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" event={"ID":"41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a","Type":"ContainerStarted","Data":"dc5278118a4082ee3dcf32080fe2af11d8d321ead3ca6972f2a6ea0fe11ad14e"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.143527 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.147825 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" podStartSLOduration=4.503656747 podStartE2EDuration="15.147804918s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:19.843421986 +0000 UTC m=+886.393127751" 
lastFinishedPulling="2025-10-06 12:23:30.487570137 +0000 UTC m=+897.037275922" observedRunningTime="2025-10-06 12:23:33.13641876 +0000 UTC m=+899.686124525" watchObservedRunningTime="2025-10-06 12:23:33.147804918 +0000 UTC m=+899.697510683" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.148169 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" event={"ID":"e81e38e4-cf8a-4a93-8753-335c6c1aca6c","Type":"ContainerStarted","Data":"f36815cec0a93fd95bd84f920d5e770499d9e3c12a9490a9508f19f71f008f7a"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.148334 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.152443 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" event={"ID":"8031228c-d653-49ea-aa71-90709d299152","Type":"ContainerStarted","Data":"83d604d80fd402ee008d975d019e06ea6b0cfcee7b1d7f2469f69959b7e0b86e"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.152573 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.154507 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" podStartSLOduration=4.270804274 podStartE2EDuration="14.154482934s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.603526767 +0000 UTC m=+887.153232522" lastFinishedPulling="2025-10-06 12:23:30.487205377 +0000 UTC m=+897.036911182" observedRunningTime="2025-10-06 12:23:33.153159237 +0000 UTC m=+899.702865012" watchObservedRunningTime="2025-10-06 12:23:33.154482934 +0000 UTC m=+899.704188699" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.154867 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" event={"ID":"c71106dc-615a-4668-8485-8140171bd46e","Type":"ContainerStarted","Data":"d2b95429799bcc2fc781481ec48ebcade603aa36f626f6699d14ab1f805ab531"} Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.190244 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" podStartSLOduration=5.294048444 podStartE2EDuration="14.190222721s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:21.587476421 +0000 UTC m=+888.137182186" lastFinishedPulling="2025-10-06 12:23:30.483650668 +0000 UTC m=+897.033356463" observedRunningTime="2025-10-06 12:23:33.176868528 +0000 UTC m=+899.726574293" watchObservedRunningTime="2025-10-06 12:23:33.190222721 +0000 UTC m=+899.739928496" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.202197 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" podStartSLOduration=4.6494513699999995 podStartE2EDuration="15.202179024s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:19.87590882 +0000 UTC m=+886.425614585" lastFinishedPulling="2025-10-06 12:23:30.428636464 +0000 UTC m=+896.978342239" 
observedRunningTime="2025-10-06 12:23:33.197659018 +0000 UTC m=+899.747364783" watchObservedRunningTime="2025-10-06 12:23:33.202179024 +0000 UTC m=+899.751884789" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.217231 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" podStartSLOduration=5.309506394 podStartE2EDuration="15.217213013s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.519070063 +0000 UTC m=+887.068775828" lastFinishedPulling="2025-10-06 12:23:30.426776682 +0000 UTC m=+896.976482447" observedRunningTime="2025-10-06 12:23:33.214758255 +0000 UTC m=+899.764464020" watchObservedRunningTime="2025-10-06 12:23:33.217213013 +0000 UTC m=+899.766918768" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.232863 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" podStartSLOduration=4.141373986 podStartE2EDuration="14.232842079s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.342975233 +0000 UTC m=+886.892680998" lastFinishedPulling="2025-10-06 12:23:30.434443326 +0000 UTC m=+896.984149091" observedRunningTime="2025-10-06 12:23:33.231749598 +0000 UTC m=+899.781455363" watchObservedRunningTime="2025-10-06 12:23:33.232842079 +0000 UTC m=+899.782547844" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.247307 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" podStartSLOduration=4.651375865 podStartE2EDuration="15.247288042s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:19.83246085 +0000 UTC m=+886.382166615" lastFinishedPulling="2025-10-06 12:23:30.428373027 +0000 UTC m=+896.978078792" observedRunningTime="2025-10-06 12:23:33.244616067 +0000 UTC m=+899.794321832" watchObservedRunningTime="2025-10-06 12:23:33.247288042 +0000 UTC m=+899.796993817" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.269433 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" podStartSLOduration=4.446606186 podStartE2EDuration="14.269404158s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.605968386 +0000 UTC m=+887.155674151" lastFinishedPulling="2025-10-06 12:23:30.428766338 +0000 UTC m=+896.978472123" observedRunningTime="2025-10-06 12:23:33.260759177 +0000 UTC m=+899.810464952" watchObservedRunningTime="2025-10-06 12:23:33.269404158 +0000 UTC m=+899.819109953" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.278088 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" podStartSLOduration=5.125599586 podStartE2EDuration="15.27807009s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.331173674 +0000 UTC m=+886.880879439" lastFinishedPulling="2025-10-06 12:23:30.483644168 +0000 UTC m=+897.033349943" observedRunningTime="2025-10-06 12:23:33.274198072 +0000 UTC m=+899.823903867" watchObservedRunningTime="2025-10-06 12:23:33.27807009 +0000 UTC m=+899.827775865" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.291070 4892 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" podStartSLOduration=4.94605011 podStartE2EDuration="15.291052182s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.084159837 +0000 UTC m=+886.633865592" lastFinishedPulling="2025-10-06 12:23:30.429161899 +0000 UTC m=+896.978867664" observedRunningTime="2025-10-06 12:23:33.290368943 +0000 UTC m=+899.840074698" watchObservedRunningTime="2025-10-06 12:23:33.291052182 +0000 UTC m=+899.840757947" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.310960 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" podStartSLOduration=4.440617179 podStartE2EDuration="14.310930116s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.558566884 +0000 UTC m=+887.108272649" lastFinishedPulling="2025-10-06 12:23:30.428879791 +0000 UTC m=+896.978585586" observedRunningTime="2025-10-06 12:23:33.308017815 +0000 UTC m=+899.857723590" watchObservedRunningTime="2025-10-06 12:23:33.310930116 +0000 UTC m=+899.860635881" Oct 06 12:23:33 crc kubenswrapper[4892]: I1006 12:23:33.326445 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" podStartSLOduration=4.495850838 podStartE2EDuration="14.326430238s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.697022234 +0000 UTC m=+887.246727999" lastFinishedPulling="2025-10-06 12:23:30.527601624 +0000 UTC m=+897.077307399" observedRunningTime="2025-10-06 12:23:33.322153829 +0000 UTC m=+899.871859614" watchObservedRunningTime="2025-10-06 12:23:33.326430238 +0000 UTC m=+899.876136003" Oct 06 12:23:34 crc kubenswrapper[4892]: I1006 12:23:34.163159 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.157105 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-b6gs7" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.183569 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-p2ncw" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.210807 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-68ldg" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.256080 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-gtrv6" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.281781 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-s8hw9" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.294814 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-qdxqw" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.347684 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-649675d675-h6gp4" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.356400 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-s8527" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.441785 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-cfcl2" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.467006 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.583589 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-ls78t" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.601942 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-6ws6z" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.676829 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-q7kbv" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.835746 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-z6qpz" Oct 06 12:23:39 crc kubenswrapper[4892]: I1006 12:23:39.890763 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5966748665-mwjwq" Oct 06 12:23:41 crc kubenswrapper[4892]: I1006 12:23:41.136464 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.288205 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" event={"ID":"94df4eb7-c559-4ce8-9deb-5da3a04bebb7","Type":"ContainerStarted","Data":"785c7396e10ba7e0cee5b8c7a6e82bbf4c19d0f1cc24b5a40db58603d4196f1c"} Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.289282 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.299289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" event={"ID":"5c85c0be-894b-4469-820c-35cac2b32905","Type":"ContainerStarted","Data":"3919754abd51d395f5b40db4b093aff390507a1389d2c54a816fba100e6a638c"} Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.299791 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.305663 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" event={"ID":"6a4055ea-2ed1-4de3-a975-850404b8d746","Type":"ContainerStarted","Data":"a72568a86fb6495add69d086d7a849e886ab4164739e9ba734cb46227779ff34"} Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.313951 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" podStartSLOduration=2.707006743 podStartE2EDuration="26.313932423s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.66316927 +0000 UTC m=+887.212875035" lastFinishedPulling="2025-10-06 12:23:44.27009495 +0000 UTC m=+910.819800715" observedRunningTime="2025-10-06 12:23:45.305775455 +0000 UTC m=+911.855481220" watchObservedRunningTime="2025-10-06 12:23:45.313932423 +0000 UTC m=+911.863638188" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.316570 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.329409 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" event={"ID":"304bfead-26e9-4a95-9c1d-8659de9b0546","Type":"ContainerStarted","Data":"0b87bac1269d2927f76ae7ffe64394c79e9e74da8b60e58e86b42624a8c1a0d9"} Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.330385 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.337006 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" podStartSLOduration=2.762867061 podStartE2EDuration="26.336989126s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.699988697 +0000 UTC m=+887.249694462" lastFinishedPulling="2025-10-06 12:23:44.274110752 +0000 UTC m=+910.823816527" observedRunningTime="2025-10-06 12:23:45.332702586 +0000 UTC m=+911.882408351" watchObservedRunningTime="2025-10-06 12:23:45.336989126 +0000 UTC m=+911.886694891" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.357891 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" podStartSLOduration=3.788006992 podStartE2EDuration="27.357872398s" podCreationTimestamp="2025-10-06 12:23:18 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.705441329 +0000 UTC m=+887.255147104" lastFinishedPulling="2025-10-06 12:23:44.275306745 +0000 UTC m=+910.825012510" observedRunningTime="2025-10-06 12:23:45.350252096 +0000 UTC m=+911.899957861" watchObservedRunningTime="2025-10-06 12:23:45.357872398 +0000 UTC m=+911.907578173" Oct 06 12:23:45 crc kubenswrapper[4892]: I1006 12:23:45.369513 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" podStartSLOduration=2.870711387 podStartE2EDuration="26.369493802s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.780297706 +0000 UTC m=+887.330003471" lastFinishedPulling="2025-10-06 12:23:44.279080131 +0000 UTC m=+910.828785886" observedRunningTime="2025-10-06 12:23:45.363051142 +0000 UTC m=+911.912756917" watchObservedRunningTime="2025-10-06 12:23:45.369493802 +0000 UTC m=+911.919199577" Oct 06 12:23:46 crc kubenswrapper[4892]: I1006 12:23:46.343153 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" 
event={"ID":"7e7d79cb-0957-4ef5-9b20-f82fd5d288d8","Type":"ContainerStarted","Data":"c684bf2a5df8c44e78e7d62ec37de60958da13fc356c99fc4fd34fff604eb1e0"} Oct 06 12:23:46 crc kubenswrapper[4892]: I1006 12:23:46.369147 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" podStartSLOduration=2.179827736 podStartE2EDuration="27.369123873s" podCreationTimestamp="2025-10-06 12:23:19 +0000 UTC" firstStartedPulling="2025-10-06 12:23:20.731174807 +0000 UTC m=+887.280880572" lastFinishedPulling="2025-10-06 12:23:45.920470934 +0000 UTC m=+912.470176709" observedRunningTime="2025-10-06 12:23:46.363214038 +0000 UTC m=+912.912919813" watchObservedRunningTime="2025-10-06 12:23:46.369123873 +0000 UTC m=+912.918829668" Oct 06 12:23:49 crc kubenswrapper[4892]: I1006 12:23:49.610121 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" Oct 06 12:23:49 crc kubenswrapper[4892]: I1006 12:23:49.647753 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-7jqz4" Oct 06 12:23:49 crc kubenswrapper[4892]: I1006 12:23:49.734865 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-5gfhv" Oct 06 12:23:49 crc kubenswrapper[4892]: I1006 12:23:49.755512 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-wht6n" Oct 06 12:23:49 crc kubenswrapper[4892]: I1006 12:23:49.920474 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-bdx4w" Oct 06 12:23:59 crc kubenswrapper[4892]: I1006 12:23:59.614125 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-lvcfv" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.928928 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66cfd9876c-9bjc8"] Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.931691 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.934217 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.934417 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.934763 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.935003 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-h7f65" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.935514 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cfd9876c-9bjc8"] Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.975516 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65494c5d5-dlnsf"] Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.976639 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.979395 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 06 12:24:18 crc kubenswrapper[4892]: I1006 12:24:18.994750 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65494c5d5-dlnsf"] Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.077844 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7fb0796-3649-439d-b1ae-023f6204fe75-config\") pod \"dnsmasq-dns-66cfd9876c-9bjc8\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.078013 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx5qk\" (UniqueName: \"kubernetes.io/projected/319fa89a-7cb3-4e55-b741-f001495454b4-kube-api-access-sx5qk\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.078203 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7264m\" (UniqueName: \"kubernetes.io/projected/f7fb0796-3649-439d-b1ae-023f6204fe75-kube-api-access-7264m\") pod \"dnsmasq-dns-66cfd9876c-9bjc8\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.078315 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-dns-svc\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.078460 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-config\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.179681 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-config\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.179733 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7fb0796-3649-439d-b1ae-023f6204fe75-config\") pod \"dnsmasq-dns-66cfd9876c-9bjc8\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.179754 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx5qk\" (UniqueName: \"kubernetes.io/projected/319fa89a-7cb3-4e55-b741-f001495454b4-kube-api-access-sx5qk\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 
12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.179828 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7264m\" (UniqueName: \"kubernetes.io/projected/f7fb0796-3649-439d-b1ae-023f6204fe75-kube-api-access-7264m\") pod \"dnsmasq-dns-66cfd9876c-9bjc8\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.179851 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-dns-svc\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.180855 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-dns-svc\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.180870 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-config\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.181048 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7fb0796-3649-439d-b1ae-023f6204fe75-config\") pod \"dnsmasq-dns-66cfd9876c-9bjc8\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.197230 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7264m\" (UniqueName: \"kubernetes.io/projected/f7fb0796-3649-439d-b1ae-023f6204fe75-kube-api-access-7264m\") pod \"dnsmasq-dns-66cfd9876c-9bjc8\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.198067 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx5qk\" (UniqueName: \"kubernetes.io/projected/319fa89a-7cb3-4e55-b741-f001495454b4-kube-api-access-sx5qk\") pod \"dnsmasq-dns-65494c5d5-dlnsf\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.249093 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.297590 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.783875 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66cfd9876c-9bjc8"] Oct 06 12:24:19 crc kubenswrapper[4892]: I1006 12:24:19.798698 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65494c5d5-dlnsf"] Oct 06 12:24:19 crc kubenswrapper[4892]: W1006 12:24:19.798851 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7fb0796_3649_439d_b1ae_023f6204fe75.slice/crio-39d1a24cb8c1815cc6a029df1ae2aed7b50e01abdac89f2f9913b350983e198d WatchSource:0}: Error finding container 39d1a24cb8c1815cc6a029df1ae2aed7b50e01abdac89f2f9913b350983e198d: Status 404 returned error can't find the container with id 39d1a24cb8c1815cc6a029df1ae2aed7b50e01abdac89f2f9913b350983e198d Oct 06 12:24:19 crc kubenswrapper[4892]: W1006 12:24:19.803263 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod319fa89a_7cb3_4e55_b741_f001495454b4.slice/crio-fae6aa5dcf57000a64707f9333e57a0389c93f40e29cce87073a9c4d1d212440 WatchSource:0}: Error finding container fae6aa5dcf57000a64707f9333e57a0389c93f40e29cce87073a9c4d1d212440: Status 404 returned error can't find the container with id fae6aa5dcf57000a64707f9333e57a0389c93f40e29cce87073a9c4d1d212440 Oct 06 12:24:20 crc kubenswrapper[4892]: I1006 12:24:20.671289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" event={"ID":"319fa89a-7cb3-4e55-b741-f001495454b4","Type":"ContainerStarted","Data":"fae6aa5dcf57000a64707f9333e57a0389c93f40e29cce87073a9c4d1d212440"} Oct 06 12:24:20 crc kubenswrapper[4892]: I1006 12:24:20.672641 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" event={"ID":"f7fb0796-3649-439d-b1ae-023f6204fe75","Type":"ContainerStarted","Data":"39d1a24cb8c1815cc6a029df1ae2aed7b50e01abdac89f2f9913b350983e198d"} Oct 06 12:24:22 crc kubenswrapper[4892]: I1006 12:24:22.979913 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65494c5d5-dlnsf"] Oct 06 12:24:22 crc kubenswrapper[4892]: I1006 12:24:22.999868 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57cf8c6957-l49lh"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.002066 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.007731 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57cf8c6957-l49lh"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.138197 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkthx\" (UniqueName: \"kubernetes.io/projected/d7343495-cff2-4968-8b72-2153c84f0a54-kube-api-access-wkthx\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.138249 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-dns-svc\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.138277 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-config\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.228985 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cfd9876c-9bjc8"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.239862 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkthx\" (UniqueName: \"kubernetes.io/projected/d7343495-cff2-4968-8b72-2153c84f0a54-kube-api-access-wkthx\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.240017 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-dns-svc\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.240072 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-config\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.240952 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-dns-svc\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.243345 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-config\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.250823 
4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8f8dc5f77-86rgb"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.252004 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.272484 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8f8dc5f77-86rgb"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.273465 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkthx\" (UniqueName: \"kubernetes.io/projected/d7343495-cff2-4968-8b72-2153c84f0a54-kube-api-access-wkthx\") pod \"dnsmasq-dns-57cf8c6957-l49lh\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") " pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.322850 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.340965 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-config\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.341001 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-dns-svc\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.341077 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdj2t\" (UniqueName: \"kubernetes.io/projected/d54376a9-b390-48c7-a32f-60d1d73b93d0-kube-api-access-tdj2t\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.442354 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-config\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.442393 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-dns-svc\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.442451 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdj2t\" (UniqueName: \"kubernetes.io/projected/d54376a9-b390-48c7-a32f-60d1d73b93d0-kube-api-access-tdj2t\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.443178 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-config\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.443351 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-dns-svc\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.460573 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdj2t\" (UniqueName: \"kubernetes.io/projected/d54376a9-b390-48c7-a32f-60d1d73b93d0-kube-api-access-tdj2t\") pod \"dnsmasq-dns-8f8dc5f77-86rgb\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.505139 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57cf8c6957-l49lh"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.520900 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc64b8dc7-tlr89"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.522267 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.534847 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc64b8dc7-tlr89"] Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.590136 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.645675 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-dns-svc\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.645732 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjzkr\" (UniqueName: \"kubernetes.io/projected/fa73d237-4fcb-45b6-b394-fb9295df0e2d-kube-api-access-bjzkr\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.645766 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-config\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.747084 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-dns-svc\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.747162 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bjzkr\" (UniqueName: \"kubernetes.io/projected/fa73d237-4fcb-45b6-b394-fb9295df0e2d-kube-api-access-bjzkr\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.747203 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-config\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.747999 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-dns-svc\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.748214 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-config\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.770488 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjzkr\" (UniqueName: \"kubernetes.io/projected/fa73d237-4fcb-45b6-b394-fb9295df0e2d-kube-api-access-bjzkr\") pod \"dnsmasq-dns-5dc64b8dc7-tlr89\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:23 crc kubenswrapper[4892]: I1006 12:24:23.883161 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.142580 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.144382 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.147586 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.147739 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.148964 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.149455 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.150908 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.151233 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.152429 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-sd5cd" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.154103 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264635 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264694 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264739 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/000efd26-a8c0-4668-9603-9ee7a9aed0ed-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264768 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7cq\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-kube-api-access-fw7cq\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " 
pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264811 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264825 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264852 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264927 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.264963 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.265003 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/000efd26-a8c0-4668-9603-9ee7a9aed0ed-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369378 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369453 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369477 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/000efd26-a8c0-4668-9603-9ee7a9aed0ed-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7cq\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-kube-api-access-fw7cq\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369603 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369647 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369673 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369736 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369802 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369846 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/000efd26-a8c0-4668-9603-9ee7a9aed0ed-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.369903 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.371098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.371859 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.372380 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/000efd26-a8c0-4668-9603-9ee7a9aed0ed-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.373024 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.373922 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.373978 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.374212 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.375801 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.378414 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.378569 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.381341 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/000efd26-a8c0-4668-9603-9ee7a9aed0ed-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.382533 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.382775 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.382891 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mrf4b" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.383042 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/000efd26-a8c0-4668-9603-9ee7a9aed0ed-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.383259 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.386437 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.387534 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.394247 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.394589 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.398076 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7cq\" (UniqueName: \"kubernetes.io/projected/000efd26-a8c0-4668-9603-9ee7a9aed0ed-kube-api-access-fw7cq\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.411597 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"000efd26-a8c0-4668-9603-9ee7a9aed0ed\") " pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471715 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471777 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471831 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471860 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471886 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471916 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471951 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471975 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.471997 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.472040 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zlt6\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-kube-api-access-9zlt6\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.472067 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.484080 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.573727 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.573784 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.573826 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.573852 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.573879 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.573957 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zlt6\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-kube-api-access-9zlt6\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.573985 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.574027 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.574034 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.574054 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.574896 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.574956 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.575023 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-config-data\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.575406 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.575998 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.576066 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.576893 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.578511 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.580578 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.580904 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.590673 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zlt6\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-kube-api-access-9zlt6\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.592434 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.623696 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.656380 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.658879 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.660687 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.661213 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-zllld" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.661994 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.662137 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.662279 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.662452 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.662553 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.692130 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777638 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777700 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777753 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777774 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5pk\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-kube-api-access-gx5pk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777808 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bcc90cdb-7f84-4923-9eef-4fae34199b75-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777833 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777879 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777900 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bcc90cdb-7f84-4923-9eef-4fae34199b75-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777929 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.777954 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.778009 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.786815 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881670 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881741 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881774 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881827 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881851 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5pk\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-kube-api-access-gx5pk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881882 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bcc90cdb-7f84-4923-9eef-4fae34199b75-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881958 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.881978 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bcc90cdb-7f84-4923-9eef-4fae34199b75-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.882007 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.882033 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.882842 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.883016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.883173 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.883539 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.885220 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.885880 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.885890 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bcc90cdb-7f84-4923-9eef-4fae34199b75-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.886459 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bcc90cdb-7f84-4923-9eef-4fae34199b75-erlang-cookie-secret\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.888064 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.889363 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.909018 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5pk\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-kube-api-access-gx5pk\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:24 crc kubenswrapper[4892]: I1006 12:24:24.912794 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:25 crc kubenswrapper[4892]: I1006 12:24:25.011561 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.705434 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.708211 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.713397 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zdwc2" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.713612 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.713768 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.714145 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.714562 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.719817 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.720602 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.823026 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.825312 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.830513 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-secrets\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.830633 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-config-data-default\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.830720 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-kolla-config\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.830760 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831002 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831061 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831100 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/728515b5-40b3-48f4-8452-85ce84a9930a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831188 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831238 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmqz\" (UniqueName: \"kubernetes.io/projected/728515b5-40b3-48f4-8452-85ce84a9930a-kube-api-access-9xmqz\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " 
pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831422 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831548 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831628 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.831763 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-s25h5" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.844018 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934367 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htcss\" (UniqueName: \"kubernetes.io/projected/19207559-7eb7-49b5-9b73-0641f426ab63-kube-api-access-htcss\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934419 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934439 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934459 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmqz\" (UniqueName: \"kubernetes.io/projected/728515b5-40b3-48f4-8452-85ce84a9930a-kube-api-access-9xmqz\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934489 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-secrets\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934516 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19207559-7eb7-49b5-9b73-0641f426ab63-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934530 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934551 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-config-data-default\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934568 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934599 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934620 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-kolla-config\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934634 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934657 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934705 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934753 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 
06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934802 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.934821 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/728515b5-40b3-48f4-8452-85ce84a9930a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.935194 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/728515b5-40b3-48f4-8452-85ce84a9930a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.935548 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.936617 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-kolla-config\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.936724 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.937443 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/728515b5-40b3-48f4-8452-85ce84a9930a-config-data-default\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.950262 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-secrets\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.950375 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.951219 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/728515b5-40b3-48f4-8452-85ce84a9930a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.958228 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmqz\" (UniqueName: \"kubernetes.io/projected/728515b5-40b3-48f4-8452-85ce84a9930a-kube-api-access-9xmqz\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:27 crc kubenswrapper[4892]: I1006 12:24:27.961449 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"728515b5-40b3-48f4-8452-85ce84a9930a\") " pod="openstack/openstack-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036067 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htcss\" (UniqueName: \"kubernetes.io/projected/19207559-7eb7-49b5-9b73-0641f426ab63-kube-api-access-htcss\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036121 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19207559-7eb7-49b5-9b73-0641f426ab63-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036196 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036227 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036315 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " 
pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036365 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036403 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036564 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.036750 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/19207559-7eb7-49b5-9b73-0641f426ab63-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.037009 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.037421 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.039288 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19207559-7eb7-49b5-9b73-0641f426ab63-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.041542 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.042654 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.045738 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/19207559-7eb7-49b5-9b73-0641f426ab63-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.052528 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.065199 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htcss\" (UniqueName: \"kubernetes.io/projected/19207559-7eb7-49b5-9b73-0641f426ab63-kube-api-access-htcss\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.075419 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"19207559-7eb7-49b5-9b73-0641f426ab63\") " pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.145223 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.186564 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.187935 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.192308 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.192541 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.192782 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4pcgl" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.199676 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.241549 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsr9p\" (UniqueName: \"kubernetes.io/projected/82b202ad-f5d6-406b-9821-3a4a18c795eb-kube-api-access-qsr9p\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.241650 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b202ad-f5d6-406b-9821-3a4a18c795eb-config-data\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.241679 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/82b202ad-f5d6-406b-9821-3a4a18c795eb-kolla-config\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.241713 4892 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b202ad-f5d6-406b-9821-3a4a18c795eb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.241748 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b202ad-f5d6-406b-9821-3a4a18c795eb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.342701 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsr9p\" (UniqueName: \"kubernetes.io/projected/82b202ad-f5d6-406b-9821-3a4a18c795eb-kube-api-access-qsr9p\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.343065 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b202ad-f5d6-406b-9821-3a4a18c795eb-config-data\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.343098 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/82b202ad-f5d6-406b-9821-3a4a18c795eb-kolla-config\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.343129 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b202ad-f5d6-406b-9821-3a4a18c795eb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.343166 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b202ad-f5d6-406b-9821-3a4a18c795eb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.344062 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82b202ad-f5d6-406b-9821-3a4a18c795eb-config-data\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.344692 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/82b202ad-f5d6-406b-9821-3a4a18c795eb-kolla-config\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.351062 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/82b202ad-f5d6-406b-9821-3a4a18c795eb-memcached-tls-certs\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc 
kubenswrapper[4892]: I1006 12:24:28.352816 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82b202ad-f5d6-406b-9821-3a4a18c795eb-combined-ca-bundle\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.360561 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsr9p\" (UniqueName: \"kubernetes.io/projected/82b202ad-f5d6-406b-9821-3a4a18c795eb-kube-api-access-qsr9p\") pod \"memcached-0\" (UID: \"82b202ad-f5d6-406b-9821-3a4a18c795eb\") " pod="openstack/memcached-0" Oct 06 12:24:28 crc kubenswrapper[4892]: I1006 12:24:28.511251 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 06 12:24:29 crc kubenswrapper[4892]: I1006 12:24:29.922126 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:24:29 crc kubenswrapper[4892]: I1006 12:24:29.923111 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 12:24:29 crc kubenswrapper[4892]: I1006 12:24:29.925328 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n29j8" Oct 06 12:24:29 crc kubenswrapper[4892]: I1006 12:24:29.938688 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:24:29 crc kubenswrapper[4892]: I1006 12:24:29.961978 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xprcl\" (UniqueName: \"kubernetes.io/projected/77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38-kube-api-access-xprcl\") pod \"kube-state-metrics-0\" (UID: \"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38\") " pod="openstack/kube-state-metrics-0" Oct 06 12:24:30 crc kubenswrapper[4892]: I1006 12:24:30.064226 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xprcl\" (UniqueName: \"kubernetes.io/projected/77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38-kube-api-access-xprcl\") pod \"kube-state-metrics-0\" (UID: \"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38\") " pod="openstack/kube-state-metrics-0" Oct 06 12:24:30 crc kubenswrapper[4892]: I1006 12:24:30.110012 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xprcl\" (UniqueName: \"kubernetes.io/projected/77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38-kube-api-access-xprcl\") pod \"kube-state-metrics-0\" (UID: \"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38\") " pod="openstack/kube-state-metrics-0" Oct 06 12:24:30 crc kubenswrapper[4892]: I1006 12:24:30.242719 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.262803 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.265304 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.269163 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.269189 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.269239 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.269432 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.269624 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7smc" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.277819 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.288989 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382214 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83392b37-0087-4c7c-ab0d-91af0c170445-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382288 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382311 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382399 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-config\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382450 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r562t\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-kube-api-access-r562t\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382537 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83392b37-0087-4c7c-ab0d-91af0c170445-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.382595 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484475 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484524 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484558 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484624 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-config\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484686 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r562t\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-kube-api-access-r562t\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484708 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83392b37-0087-4c7c-ab0d-91af0c170445-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484725 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.484772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83392b37-0087-4c7c-ab0d-91af0c170445-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.489931 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83392b37-0087-4c7c-ab0d-91af0c170445-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.492532 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83392b37-0087-4c7c-ab0d-91af0c170445-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.496004 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.496050 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b0f201420d0adafcb475a965fbfd99b4a272413cc10e31ea76ae8257a696a4f5/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.497795 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.498186 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-config\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.499174 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.508402 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r562t\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-kube-api-access-r562t\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.522971 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.553623 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:31 crc kubenswrapper[4892]: I1006 12:24:31.588193 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.174520 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.177534 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.183521 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.183914 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pn66q" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.184384 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.185108 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.186021 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.192924 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.321947 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.321989 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.322010 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.322049 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dvnw\" (UniqueName: \"kubernetes.io/projected/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-kube-api-access-4dvnw\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.322067 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.322092 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.322354 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-config\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.322382 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.424168 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.424225 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.424251 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.424372 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dvnw\" (UniqueName: \"kubernetes.io/projected/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-kube-api-access-4dvnw\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.424816 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.424988 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.425042 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.425090 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-config\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 
12:24:33.425124 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.425453 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.425885 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-config\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.426683 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.436950 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.441344 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.444940 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.456956 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dvnw\" (UniqueName: \"kubernetes.io/projected/516888f8-ccb5-4bbc-b11d-c09a8dbd19b1-kube-api-access-4dvnw\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.461504 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1\") " pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.504529 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.988273 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l6mw2"] Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.990081 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.993051 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xkhtj" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.993290 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 12:24:33 crc kubenswrapper[4892]: I1006 12:24:33.993770 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.010472 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qtgzv"] Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.012091 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.030555 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l6mw2"] Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.037785 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qtgzv"] Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.137706 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-run-ovn\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.137776 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-lib\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.137811 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n5n6\" (UniqueName: \"kubernetes.io/projected/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-kube-api-access-6n5n6\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.137848 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-ovn-controller-tls-certs\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.137875 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20f13b95-c224-4a4d-acd3-ad229e3223fb-scripts\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 
12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.137902 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-etc-ovs\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.138066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-combined-ca-bundle\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.138215 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-log-ovn\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.138253 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-run\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.138275 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-scripts\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.138577 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7gx\" (UniqueName: \"kubernetes.io/projected/20f13b95-c224-4a4d-acd3-ad229e3223fb-kube-api-access-pq7gx\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.138711 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-log\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.138753 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-run\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240066 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-log-ovn\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240141 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-run\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240164 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-scripts\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240238 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq7gx\" (UniqueName: \"kubernetes.io/projected/20f13b95-c224-4a4d-acd3-ad229e3223fb-kube-api-access-pq7gx\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240284 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-log\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240303 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-run\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240360 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-run-ovn\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240380 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-lib\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240398 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n5n6\" (UniqueName: \"kubernetes.io/projected/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-kube-api-access-6n5n6\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240419 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-ovn-controller-tls-certs\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240512 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20f13b95-c224-4a4d-acd3-ad229e3223fb-scripts\") pod 
\"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240600 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-etc-ovs\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240627 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-combined-ca-bundle\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240842 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-run\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.241008 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-run-ovn\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.240842 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-run\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.241023 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-log\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.241191 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-var-lib\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.241234 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/20f13b95-c224-4a4d-acd3-ad229e3223fb-etc-ovs\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.241363 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-var-log-ovn\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.242176 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 06 12:24:34 
crc kubenswrapper[4892]: I1006 12:24:34.244352 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.248465 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-combined-ca-bundle\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.252391 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-scripts\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.253674 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20f13b95-c224-4a4d-acd3-ad229e3223fb-scripts\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.255208 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7gx\" (UniqueName: \"kubernetes.io/projected/20f13b95-c224-4a4d-acd3-ad229e3223fb-kube-api-access-pq7gx\") pod \"ovn-controller-ovs-qtgzv\" (UID: \"20f13b95-c224-4a4d-acd3-ad229e3223fb\") " pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.255292 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-ovn-controller-tls-certs\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.258611 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n5n6\" (UniqueName: \"kubernetes.io/projected/cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8-kube-api-access-6n5n6\") pod \"ovn-controller-l6mw2\" (UID: \"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8\") " pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.317784 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-xkhtj" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.325039 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l6mw2" Oct 06 12:24:34 crc kubenswrapper[4892]: I1006 12:24:34.333748 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.286515 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.290028 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.292674 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.292999 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.293300 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-phxlv" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.293339 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.304602 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402427 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402496 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4b1e93-1c14-436a-84e3-7d9359228563-config\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402621 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402654 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402677 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402698 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae4b1e93-1c14-436a-84e3-7d9359228563-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402718 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ae4b1e93-1c14-436a-84e3-7d9359228563-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " 
pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.402742 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk587\" (UniqueName: \"kubernetes.io/projected/ae4b1e93-1c14-436a-84e3-7d9359228563-kube-api-access-zk587\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.449974 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.504804 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.504871 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.504912 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.504938 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae4b1e93-1c14-436a-84e3-7d9359228563-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.504964 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ae4b1e93-1c14-436a-84e3-7d9359228563-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.505105 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk587\" (UniqueName: \"kubernetes.io/projected/ae4b1e93-1c14-436a-84e3-7d9359228563-kube-api-access-zk587\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.505142 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.505314 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc 
kubenswrapper[4892]: I1006 12:24:37.505833 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ae4b1e93-1c14-436a-84e3-7d9359228563-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.506212 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae4b1e93-1c14-436a-84e3-7d9359228563-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.508145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4b1e93-1c14-436a-84e3-7d9359228563-config\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.511954 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.513537 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.514034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae4b1e93-1c14-436a-84e3-7d9359228563-config\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.515697 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae4b1e93-1c14-436a-84e3-7d9359228563-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.525542 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk587\" (UniqueName: \"kubernetes.io/projected/ae4b1e93-1c14-436a-84e3-7d9359228563-kube-api-access-zk587\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.532891 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ae4b1e93-1c14-436a-84e3-7d9359228563\") " pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: I1006 12:24:37.610151 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:37 crc kubenswrapper[4892]: E1006 12:24:37.818011 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 12:24:37 crc kubenswrapper[4892]: E1006 12:24:37.818073 4892 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 12:24:37 crc kubenswrapper[4892]: E1006 12:24:37.818179 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.98:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7264m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-66cfd9876c-9bjc8_openstack(f7fb0796-3649-439d-b1ae-023f6204fe75): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 12:24:37 crc kubenswrapper[4892]: E1006 12:24:37.821315 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" podUID="f7fb0796-3649-439d-b1ae-023f6204fe75" Oct 06 12:24:37 crc kubenswrapper[4892]: E1006 12:24:37.959121 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 12:24:37 crc 
kubenswrapper[4892]: E1006 12:24:37.959528 4892 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 06 12:24:37 crc kubenswrapper[4892]: E1006 12:24:37.959667 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.98:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sx5qk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-65494c5d5-dlnsf_openstack(319fa89a-7cb3-4e55-b741-f001495454b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 12:24:37 crc kubenswrapper[4892]: E1006 12:24:37.961184 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" podUID="319fa89a-7cb3-4e55-b741-f001495454b4" Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.375626 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.539740 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7264m\" (UniqueName: \"kubernetes.io/projected/f7fb0796-3649-439d-b1ae-023f6204fe75-kube-api-access-7264m\") pod \"f7fb0796-3649-439d-b1ae-023f6204fe75\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.539850 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7fb0796-3649-439d-b1ae-023f6204fe75-config\") pod \"f7fb0796-3649-439d-b1ae-023f6204fe75\" (UID: \"f7fb0796-3649-439d-b1ae-023f6204fe75\") " Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.540641 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7fb0796-3649-439d-b1ae-023f6204fe75-config" (OuterVolumeSpecName: "config") pod "f7fb0796-3649-439d-b1ae-023f6204fe75" (UID: "f7fb0796-3649-439d-b1ae-023f6204fe75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.550527 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7fb0796-3649-439d-b1ae-023f6204fe75-kube-api-access-7264m" (OuterVolumeSpecName: "kube-api-access-7264m") pod "f7fb0796-3649-439d-b1ae-023f6204fe75" (UID: "f7fb0796-3649-439d-b1ae-023f6204fe75"). InnerVolumeSpecName "kube-api-access-7264m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.643460 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7fb0796-3649-439d-b1ae-023f6204fe75-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.643491 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7264m\" (UniqueName: \"kubernetes.io/projected/f7fb0796-3649-439d-b1ae-023f6204fe75-kube-api-access-7264m\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.711564 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.728267 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8f8dc5f77-86rgb"] Oct 06 12:24:38 crc kubenswrapper[4892]: W1006 12:24:38.729690 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod000efd26_a8c0_4668_9603_9ee7a9aed0ed.slice/crio-7be5778594498cae902305af0164eb0a562c14c745cbe7732c1002e4f20467a9 WatchSource:0}: Error finding container 7be5778594498cae902305af0164eb0a562c14c745cbe7732c1002e4f20467a9: Status 404 returned error can't find the container with id 7be5778594498cae902305af0164eb0a562c14c745cbe7732c1002e4f20467a9 Oct 06 12:24:38 crc kubenswrapper[4892]: W1006 12:24:38.731755 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbae4d7a_3312_45b6_8af2_dd9e7ae8bf4c.slice/crio-9e6a7a5aea9cb8a9e54ec08bbcb82a74889370b873a5a0ee7c37083195a97d25 WatchSource:0}: Error finding container 9e6a7a5aea9cb8a9e54ec08bbcb82a74889370b873a5a0ee7c37083195a97d25: Status 404 returned error can't find the container with id 
9e6a7a5aea9cb8a9e54ec08bbcb82a74889370b873a5a0ee7c37083195a97d25 Oct 06 12:24:38 crc kubenswrapper[4892]: W1006 12:24:38.733917 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd54376a9_b390_48c7_a32f_60d1d73b93d0.slice/crio-3802433bb64648c3b307368da8c3fe20d9643552d2e4b8b8262bbdf444b6ec79 WatchSource:0}: Error finding container 3802433bb64648c3b307368da8c3fe20d9643552d2e4b8b8262bbdf444b6ec79: Status 404 returned error can't find the container with id 3802433bb64648c3b307368da8c3fe20d9643552d2e4b8b8262bbdf444b6ec79 Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.735866 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.852117 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c","Type":"ContainerStarted","Data":"9e6a7a5aea9cb8a9e54ec08bbcb82a74889370b873a5a0ee7c37083195a97d25"} Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.853455 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"000efd26-a8c0-4668-9603-9ee7a9aed0ed","Type":"ContainerStarted","Data":"7be5778594498cae902305af0164eb0a562c14c745cbe7732c1002e4f20467a9"} Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.855271 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" event={"ID":"f7fb0796-3649-439d-b1ae-023f6204fe75","Type":"ContainerDied","Data":"39d1a24cb8c1815cc6a029df1ae2aed7b50e01abdac89f2f9913b350983e198d"} Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.855291 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66cfd9876c-9bjc8" Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.857163 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" event={"ID":"d54376a9-b390-48c7-a32f-60d1d73b93d0","Type":"ContainerStarted","Data":"3802433bb64648c3b307368da8c3fe20d9643552d2e4b8b8262bbdf444b6ec79"} Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.859407 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bcc90cdb-7f84-4923-9eef-4fae34199b75","Type":"ContainerStarted","Data":"43cf2e3715dba1961e5b0ab735063d4634df1096410b410d099389c3c38012a3"} Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.929835 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66cfd9876c-9bjc8"] Oct 06 12:24:38 crc kubenswrapper[4892]: I1006 12:24:38.942292 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66cfd9876c-9bjc8"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.050934 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.069372 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.082362 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.092655 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57cf8c6957-l49lh"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.113951 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 06 12:24:39 crc kubenswrapper[4892]: W1006 12:24:39.116017 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77d2ae66_c6bc_42e4_9e5e_cf7a9b9b9d38.slice/crio-612d2715cbadcdec45f9ad80e5f6bc23945132a36a6f2bb273bfbe9d2a462247 WatchSource:0}: Error finding container 612d2715cbadcdec45f9ad80e5f6bc23945132a36a6f2bb273bfbe9d2a462247: Status 404 returned error can't find the container with id 612d2715cbadcdec45f9ad80e5f6bc23945132a36a6f2bb273bfbe9d2a462247 Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.148481 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc64b8dc7-tlr89"] Oct 06 12:24:39 crc kubenswrapper[4892]: W1006 12:24:39.169672 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa73d237_4fcb_45b6_b394_fb9295df0e2d.slice/crio-5a72b6c67a6bc90d156ffffcd8b5622643fcbed71d393266521527760e8e7d1b WatchSource:0}: Error finding container 5a72b6c67a6bc90d156ffffcd8b5622643fcbed71d393266521527760e8e7d1b: Status 404 returned error can't find the container with id 5a72b6c67a6bc90d156ffffcd8b5622643fcbed71d393266521527760e8e7d1b Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.171844 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.193415 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.308863 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qtgzv"] Oct 06 12:24:39 crc 
kubenswrapper[4892]: W1006 12:24:39.313902 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod516888f8_ccb5_4bbc_b11d_c09a8dbd19b1.slice/crio-0d0b7cc10a541f43a4cb3d9bc8fd3174301cdd615e65399068d0e3c18cabbf17 WatchSource:0}: Error finding container 0d0b7cc10a541f43a4cb3d9bc8fd3174301cdd615e65399068d0e3c18cabbf17: Status 404 returned error can't find the container with id 0d0b7cc10a541f43a4cb3d9bc8fd3174301cdd615e65399068d0e3c18cabbf17 Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.324174 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l6mw2"] Oct 06 12:24:39 crc kubenswrapper[4892]: W1006 12:24:39.345537 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc5ba4f2_f4c1_46af_843e_6d3d4e8556e8.slice/crio-f036984ceac3554f239bf9f9216faecc9528dbb3f2fe1573cc96d5bf5cb16350 WatchSource:0}: Error finding container f036984ceac3554f239bf9f9216faecc9528dbb3f2fe1573cc96d5bf5cb16350: Status 404 returned error can't find the container with id f036984ceac3554f239bf9f9216faecc9528dbb3f2fe1573cc96d5bf5cb16350 Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.394335 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.406231 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.555819 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-config\") pod \"319fa89a-7cb3-4e55-b741-f001495454b4\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.556120 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx5qk\" (UniqueName: \"kubernetes.io/projected/319fa89a-7cb3-4e55-b741-f001495454b4-kube-api-access-sx5qk\") pod \"319fa89a-7cb3-4e55-b741-f001495454b4\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.556224 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-dns-svc\") pod \"319fa89a-7cb3-4e55-b741-f001495454b4\" (UID: \"319fa89a-7cb3-4e55-b741-f001495454b4\") " Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.557221 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "319fa89a-7cb3-4e55-b741-f001495454b4" (UID: "319fa89a-7cb3-4e55-b741-f001495454b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.558153 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-config" (OuterVolumeSpecName: "config") pod "319fa89a-7cb3-4e55-b741-f001495454b4" (UID: "319fa89a-7cb3-4e55-b741-f001495454b4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.561706 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/319fa89a-7cb3-4e55-b741-f001495454b4-kube-api-access-sx5qk" (OuterVolumeSpecName: "kube-api-access-sx5qk") pod "319fa89a-7cb3-4e55-b741-f001495454b4" (UID: "319fa89a-7cb3-4e55-b741-f001495454b4"). InnerVolumeSpecName "kube-api-access-sx5qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.657973 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.658009 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx5qk\" (UniqueName: \"kubernetes.io/projected/319fa89a-7cb3-4e55-b741-f001495454b4-kube-api-access-sx5qk\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.658023 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/319fa89a-7cb3-4e55-b741-f001495454b4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.869143 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerStarted","Data":"41fc519563b0c3caf05d2714c5f2ac6d14fc35023fc50691ed951d17d696f052"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.871249 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"82b202ad-f5d6-406b-9821-3a4a18c795eb","Type":"ContainerStarted","Data":"e483df0cf2471c86cf9c3811bab341978e5c273dcbf76ba3766824bda5d53344"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.873672 4892 generic.go:334] "Generic (PLEG): container finished" podID="d7343495-cff2-4968-8b72-2153c84f0a54" containerID="2f0bcea9f37f6194d4dcda699745737c94adbc650279a778074b2fa3894e814b" exitCode=0 Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.873755 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" event={"ID":"d7343495-cff2-4968-8b72-2153c84f0a54","Type":"ContainerDied","Data":"2f0bcea9f37f6194d4dcda699745737c94adbc650279a778074b2fa3894e814b"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.873783 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" event={"ID":"d7343495-cff2-4968-8b72-2153c84f0a54","Type":"ContainerStarted","Data":"aaa7dc1537d3ba954e9a7798a8fde25c4240dbd06eb41d7374285605e8950830"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.878106 4892 generic.go:334] "Generic (PLEG): container finished" podID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerID="0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54" exitCode=0 Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.878367 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" event={"ID":"d54376a9-b390-48c7-a32f-60d1d73b93d0","Type":"ContainerDied","Data":"0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.879716 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"728515b5-40b3-48f4-8452-85ce84a9930a","Type":"ContainerStarted","Data":"edf95812e809a2a6234eaca13feaeb94dde2a2be1bd39982dbf63061db4620af"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.881479 4892 generic.go:334] "Generic (PLEG): container finished" podID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerID="904dc31b2872529922d0f3c005434d08c258c066009006805ceb1359ad703f0b" exitCode=0 Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.881528 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" event={"ID":"fa73d237-4fcb-45b6-b394-fb9295df0e2d","Type":"ContainerDied","Data":"904dc31b2872529922d0f3c005434d08c258c066009006805ceb1359ad703f0b"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.881545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" event={"ID":"fa73d237-4fcb-45b6-b394-fb9295df0e2d","Type":"ContainerStarted","Data":"5a72b6c67a6bc90d156ffffcd8b5622643fcbed71d393266521527760e8e7d1b"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.883223 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qtgzv" event={"ID":"20f13b95-c224-4a4d-acd3-ad229e3223fb","Type":"ContainerStarted","Data":"d16813b4023359eaa4033bd2a139b72056228cfe3207aa9cf1657462819166b6"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.890469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l6mw2" event={"ID":"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8","Type":"ContainerStarted","Data":"f036984ceac3554f239bf9f9216faecc9528dbb3f2fe1573cc96d5bf5cb16350"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.893841 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1","Type":"ContainerStarted","Data":"0d0b7cc10a541f43a4cb3d9bc8fd3174301cdd615e65399068d0e3c18cabbf17"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.896225 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19207559-7eb7-49b5-9b73-0641f426ab63","Type":"ContainerStarted","Data":"a8ae2ae42398c2adf2b3a6cf580132d9aece83f6993d2d1e8b0fa2d040e896af"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.897452 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38","Type":"ContainerStarted","Data":"612d2715cbadcdec45f9ad80e5f6bc23945132a36a6f2bb273bfbe9d2a462247"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.898840 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ae4b1e93-1c14-436a-84e3-7d9359228563","Type":"ContainerStarted","Data":"9255b6fa5401e28cd56edc779711b81bbcd126a0b13b930148fe48d0912ae8a7"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.900681 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65494c5d5-dlnsf" event={"ID":"319fa89a-7cb3-4e55-b741-f001495454b4","Type":"ContainerDied","Data":"fae6aa5dcf57000a64707f9333e57a0389c93f40e29cce87073a9c4d1d212440"} Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.900750 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 12:24:39 crc kubenswrapper[4892]: I1006 12:24:39.993229 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65494c5d5-dlnsf"]
Oct 06 12:24:40 crc kubenswrapper[4892]: I1006 12:24:40.002613 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65494c5d5-dlnsf"]
Oct 06 12:24:40 crc kubenswrapper[4892]: I1006 12:24:40.177708 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="319fa89a-7cb3-4e55-b741-f001495454b4" path="/var/lib/kubelet/pods/319fa89a-7cb3-4e55-b741-f001495454b4/volumes"
Oct 06 12:24:40 crc kubenswrapper[4892]: I1006 12:24:40.178181 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7fb0796-3649-439d-b1ae-023f6204fe75" path="/var/lib/kubelet/pods/f7fb0796-3649-439d-b1ae-023f6204fe75/volumes"
Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.431297 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57cf8c6957-l49lh"
Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.555622 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-config\") pod \"d7343495-cff2-4968-8b72-2153c84f0a54\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") "
Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.555745 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkthx\" (UniqueName: \"kubernetes.io/projected/d7343495-cff2-4968-8b72-2153c84f0a54-kube-api-access-wkthx\") pod \"d7343495-cff2-4968-8b72-2153c84f0a54\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") "
Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.555911 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-dns-svc\") pod \"d7343495-cff2-4968-8b72-2153c84f0a54\" (UID: \"d7343495-cff2-4968-8b72-2153c84f0a54\") "
Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.563500 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7343495-cff2-4968-8b72-2153c84f0a54-kube-api-access-wkthx" (OuterVolumeSpecName: "kube-api-access-wkthx") pod "d7343495-cff2-4968-8b72-2153c84f0a54" (UID: "d7343495-cff2-4968-8b72-2153c84f0a54"). InnerVolumeSpecName "kube-api-access-wkthx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.580154 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7343495-cff2-4968-8b72-2153c84f0a54" (UID: "d7343495-cff2-4968-8b72-2153c84f0a54"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.582298 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-config" (OuterVolumeSpecName: "config") pod "d7343495-cff2-4968-8b72-2153c84f0a54" (UID: "d7343495-cff2-4968-8b72-2153c84f0a54"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.659740 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.659830 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkthx\" (UniqueName: \"kubernetes.io/projected/d7343495-cff2-4968-8b72-2153c84f0a54-kube-api-access-wkthx\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.659852 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7343495-cff2-4968-8b72-2153c84f0a54-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.935075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" event={"ID":"d7343495-cff2-4968-8b72-2153c84f0a54","Type":"ContainerDied","Data":"aaa7dc1537d3ba954e9a7798a8fde25c4240dbd06eb41d7374285605e8950830"} Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.935161 4892 scope.go:117] "RemoveContainer" containerID="2f0bcea9f37f6194d4dcda699745737c94adbc650279a778074b2fa3894e814b" Oct 06 12:24:43 crc kubenswrapper[4892]: I1006 12:24:43.935851 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57cf8c6957-l49lh" Oct 06 12:24:44 crc kubenswrapper[4892]: I1006 12:24:44.021288 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57cf8c6957-l49lh"] Oct 06 12:24:44 crc kubenswrapper[4892]: I1006 12:24:44.031014 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57cf8c6957-l49lh"] Oct 06 12:24:44 crc kubenswrapper[4892]: I1006 12:24:44.193995 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7343495-cff2-4968-8b72-2153c84f0a54" path="/var/lib/kubelet/pods/d7343495-cff2-4968-8b72-2153c84f0a54/volumes" Oct 06 12:24:49 crc kubenswrapper[4892]: I1006 12:24:49.000677 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" event={"ID":"d54376a9-b390-48c7-a32f-60d1d73b93d0","Type":"ContainerStarted","Data":"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144"} Oct 06 12:24:49 crc kubenswrapper[4892]: I1006 12:24:49.001302 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:49 crc kubenswrapper[4892]: I1006 12:24:49.045284 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" podStartSLOduration=25.932255273 podStartE2EDuration="26.045254763s" podCreationTimestamp="2025-10-06 12:24:23 +0000 UTC" firstStartedPulling="2025-10-06 12:24:38.738823064 +0000 UTC m=+965.288528829" lastFinishedPulling="2025-10-06 12:24:38.851822554 +0000 UTC m=+965.401528319" observedRunningTime="2025-10-06 12:24:49.021952642 +0000 UTC m=+975.571658497" watchObservedRunningTime="2025-10-06 12:24:49.045254763 +0000 UTC m=+975.594960548" Oct 06 12:24:50 crc kubenswrapper[4892]: I1006 12:24:50.010461 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"82b202ad-f5d6-406b-9821-3a4a18c795eb","Type":"ContainerStarted","Data":"00196395b6f3de804ae8275c500cf790f562dc12162deade97bf478be7d0bfd5"} Oct 06 12:24:50 crc kubenswrapper[4892]: 
Oct 06 12:24:50 crc kubenswrapper[4892]: I1006 12:24:50.012932 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" event={"ID":"fa73d237-4fcb-45b6-b394-fb9295df0e2d","Type":"ContainerStarted","Data":"366cb00b880fa452881ec9da475e87b0ff3f38cd147a18bd615f300578c98171"}
Oct 06 12:24:50 crc kubenswrapper[4892]: I1006 12:24:50.028083 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.759942925 podStartE2EDuration="22.028064631s" podCreationTimestamp="2025-10-06 12:24:28 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.134144716 +0000 UTC m=+965.683850471" lastFinishedPulling="2025-10-06 12:24:47.402266412 +0000 UTC m=+973.951972177" observedRunningTime="2025-10-06 12:24:50.025216565 +0000 UTC m=+976.574922350" watchObservedRunningTime="2025-10-06 12:24:50.028064631 +0000 UTC m=+976.577770386"
Oct 06 12:24:50 crc kubenswrapper[4892]: I1006 12:24:50.045814 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" podStartSLOduration=27.045796514 podStartE2EDuration="27.045796514s" podCreationTimestamp="2025-10-06 12:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:24:50.038152574 +0000 UTC m=+976.587858339" watchObservedRunningTime="2025-10-06 12:24:50.045796514 +0000 UTC m=+976.595502279"
Oct 06 12:24:51 crc kubenswrapper[4892]: I1006 12:24:51.025026 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ae4b1e93-1c14-436a-84e3-7d9359228563","Type":"ContainerStarted","Data":"9f7cb8e10149879003790e0d783e048a396966b89a496808e2641ab07b34c201"}
Oct 06 12:24:51 crc kubenswrapper[4892]: I1006 12:24:51.028133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1","Type":"ContainerStarted","Data":"d487c07e508efa77cba25d161e1c1dbce85ce079d048351aabef287d93c58b09"}
Oct 06 12:24:51 crc kubenswrapper[4892]: I1006 12:24:51.030989 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19207559-7eb7-49b5-9b73-0641f426ab63","Type":"ContainerStarted","Data":"32257ac18c2e01d677da5a28176777e406fc2fb6f40edbb3bb8eac11c34b27a7"}
Oct 06 12:24:51 crc kubenswrapper[4892]: I1006 12:24:51.031928 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89"
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.042137 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qtgzv" event={"ID":"20f13b95-c224-4a4d-acd3-ad229e3223fb","Type":"ContainerStarted","Data":"84397e07f8dd1e9c37a16caf975a2eabdb94aee2c9abbc48f996a71e27c71568"}
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.047548 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l6mw2" event={"ID":"cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8","Type":"ContainerStarted","Data":"23d7f06ec4a6490f33f8a569b038682ce38bcbe310841581f4b46bcdbb1824a4"}
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.048007 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-l6mw2"
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.049242 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38","Type":"ContainerStarted","Data":"ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283"}
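The "Observed pod startup duration" records above are internally consistent: for memcached-0, podStartE2EDuration (watchObservedRunningTime minus podCreationTimestamp) minus the image-pull window (lastFinishedPulling minus firstStartedPulling, taken from the monotonic m=+ offsets) reproduces podStartSLOduration exactly. A worked check of that arithmetic, under our reading of the fields:

    package main

    import "fmt"

    // Numbers copied from the memcached-0 record above.
    func main() {
        const (
            e2e           = 22.028064631  // 12:24:50.028064631 - 12:24:28 (watch time - creation)
            firstPullMono = 965.683850471 // m=+ offset of firstStartedPulling
            lastPullMono  = 973.951972177 // m=+ offset of lastFinishedPulling
        )
        slo := e2e - (lastPullMono - firstPullMono)
        fmt.Printf("%.9f\n", slo) // 13.759942925, matching podStartSLOduration
    }

The dnsmasq-dns-5dc64b8dc7-tlr89 record fits the same reading: with zero-value pull timestamps ("0001-01-01 00:00:00"), no pull window is subtracted and SLO duration equals the E2E duration.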
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.049635 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.052385 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"728515b5-40b3-48f4-8452-85ce84a9930a","Type":"ContainerStarted","Data":"b990c92695e2d6e08f1d9d38279c1c747f05c28f8ef58e77dff3d595620f6964"}
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.100365 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.44398061 podStartE2EDuration="23.100349628s" podCreationTimestamp="2025-10-06 12:24:29 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.139122195 +0000 UTC m=+965.688827960" lastFinishedPulling="2025-10-06 12:24:49.795491213 +0000 UTC m=+976.345196978" observedRunningTime="2025-10-06 12:24:52.098906024 +0000 UTC m=+978.648611789" watchObservedRunningTime="2025-10-06 12:24:52.100349628 +0000 UTC m=+978.650055393"
Oct 06 12:24:52 crc kubenswrapper[4892]: I1006 12:24:52.117501 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-l6mw2" podStartSLOduration=9.969570096 podStartE2EDuration="19.117483723s" podCreationTimestamp="2025-10-06 12:24:33 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.348083631 +0000 UTC m=+965.897789396" lastFinishedPulling="2025-10-06 12:24:48.495997238 +0000 UTC m=+975.045703023" observedRunningTime="2025-10-06 12:24:52.111977087 +0000 UTC m=+978.661682862" watchObservedRunningTime="2025-10-06 12:24:52.117483723 +0000 UTC m=+978.667189488"
Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.060999 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bcc90cdb-7f84-4923-9eef-4fae34199b75","Type":"ContainerStarted","Data":"7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2"}
Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.063406 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c","Type":"ContainerStarted","Data":"f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076"}
Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.065729 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"000efd26-a8c0-4668-9603-9ee7a9aed0ed","Type":"ContainerStarted","Data":"4cda7faae9d03cb110f0480c21e359deb1758a2d96ca327e2884fff0bb5b4f5b"}
Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.068021 4892 generic.go:334] "Generic (PLEG): container finished" podID="20f13b95-c224-4a4d-acd3-ad229e3223fb" containerID="84397e07f8dd1e9c37a16caf975a2eabdb94aee2c9abbc48f996a71e27c71568" exitCode=0
Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.069231 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qtgzv" event={"ID":"20f13b95-c224-4a4d-acd3-ad229e3223fb","Type":"ContainerDied","Data":"84397e07f8dd1e9c37a16caf975a2eabdb94aee2c9abbc48f996a71e27c71568"}
Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.593820 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb"
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.886526 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:24:53 crc kubenswrapper[4892]: I1006 12:24:53.931875 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8f8dc5f77-86rgb"] Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.099194 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerStarted","Data":"f6e35bb22f41354b2219d72a67a39722041f630b90a98e5e79da11808ed8e91a"} Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.100420 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" podUID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerName="dnsmasq-dns" containerID="cri-o://fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144" gracePeriod=10 Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.526173 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.675938 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-dns-svc\") pod \"d54376a9-b390-48c7-a32f-60d1d73b93d0\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.676096 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-config\") pod \"d54376a9-b390-48c7-a32f-60d1d73b93d0\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.676201 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdj2t\" (UniqueName: \"kubernetes.io/projected/d54376a9-b390-48c7-a32f-60d1d73b93d0-kube-api-access-tdj2t\") pod \"d54376a9-b390-48c7-a32f-60d1d73b93d0\" (UID: \"d54376a9-b390-48c7-a32f-60d1d73b93d0\") " Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.681057 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d54376a9-b390-48c7-a32f-60d1d73b93d0-kube-api-access-tdj2t" (OuterVolumeSpecName: "kube-api-access-tdj2t") pod "d54376a9-b390-48c7-a32f-60d1d73b93d0" (UID: "d54376a9-b390-48c7-a32f-60d1d73b93d0"). InnerVolumeSpecName "kube-api-access-tdj2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.715261 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-config" (OuterVolumeSpecName: "config") pod "d54376a9-b390-48c7-a32f-60d1d73b93d0" (UID: "d54376a9-b390-48c7-a32f-60d1d73b93d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.728912 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d54376a9-b390-48c7-a32f-60d1d73b93d0" (UID: "d54376a9-b390-48c7-a32f-60d1d73b93d0"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.778258 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdj2t\" (UniqueName: \"kubernetes.io/projected/d54376a9-b390-48c7-a32f-60d1d73b93d0-kube-api-access-tdj2t\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.778297 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:54 crc kubenswrapper[4892]: I1006 12:24:54.778309 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54376a9-b390-48c7-a32f-60d1d73b93d0-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.106059 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"516888f8-ccb5-4bbc-b11d-c09a8dbd19b1","Type":"ContainerStarted","Data":"3e9db9b9876de0e1db2450f4b6537bebd0b1e0117afa9e2bf7d180d58ab4037a"} Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.109521 4892 generic.go:334] "Generic (PLEG): container finished" podID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerID="fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144" exitCode=0 Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.109589 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" event={"ID":"d54376a9-b390-48c7-a32f-60d1d73b93d0","Type":"ContainerDied","Data":"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144"} Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.109593 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.109612 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8f8dc5f77-86rgb" event={"ID":"d54376a9-b390-48c7-a32f-60d1d73b93d0","Type":"ContainerDied","Data":"3802433bb64648c3b307368da8c3fe20d9643552d2e4b8b8262bbdf444b6ec79"}
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.109633 4892 scope.go:117] "RemoveContainer" containerID="fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144"
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.112066 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ae4b1e93-1c14-436a-84e3-7d9359228563","Type":"ContainerStarted","Data":"34b526009489d3a4d5f06fea3a773af418b4792598884ca80be2617be83285ab"}
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.118815 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qtgzv" event={"ID":"20f13b95-c224-4a4d-acd3-ad229e3223fb","Type":"ContainerStarted","Data":"6b790182b469d48d741e028094f9f9de6133e2d8e68504c6b61a042bb677efc3"}
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.118862 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qtgzv"
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.118876 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qtgzv"
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.118884 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qtgzv" event={"ID":"20f13b95-c224-4a4d-acd3-ad229e3223fb","Type":"ContainerStarted","Data":"a7447a7259cfb0031d0e9c9172424528ec09fb554b6a07ef9ec72f17dea591cd"}
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.133401 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.301487524 podStartE2EDuration="23.133384599s" podCreationTimestamp="2025-10-06 12:24:32 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.325891422 +0000 UTC m=+965.875597187" lastFinishedPulling="2025-10-06 12:24:54.157788497 +0000 UTC m=+980.707494262" observedRunningTime="2025-10-06 12:24:55.13142672 +0000 UTC m=+981.681132485" watchObservedRunningTime="2025-10-06 12:24:55.133384599 +0000 UTC m=+981.683090364"
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.134047 4892 scope.go:117] "RemoveContainer" containerID="0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54"
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.151712 4892 scope.go:117] "RemoveContainer" containerID="fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144"
Oct 06 12:24:55 crc kubenswrapper[4892]: E1006 12:24:55.152227 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144\": container with ID starting with fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144 not found: ID does not exist" containerID="fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144"
Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.152272 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144"} err="failed to get container status \"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144\": rpc error: code = NotFound desc = could not find container \"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144\": container with ID starting with fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144 not found: ID does not exist"
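The RemoveContainer / "ContainerStatus from runtime service failed ... NotFound" pairs above and below show the cleanup path tolerating a container that CRI-O has already removed: the NotFound error is logged and cleanup proceeds rather than failing the pod teardown. A sketch of the same idempotent-delete idea (errNotFound and the callback are ours, not the CRI error types):

    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("NotFound: ID does not exist")

    // removeContainer treats an already-deleted container as success, the
    // way the kubelet merely logs the NotFound above and moves on.
    func removeContainer(id string, remove func(string) error) error {
        err := remove(id)
        if errors.Is(err, errNotFound) {
            fmt.Printf("container %.12s already gone, ignoring\n", id)
            return nil
        }
        return err
    }

    func main() {
        gone := func(string) error { return errNotFound }
        fmt.Println(removeContainer(
            "fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144",
            gone)) // <nil>
    }

Deletion being idempotent matters here because the PLEG event, the API DELETE, and garbage collection can all race to remove the same container.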
\"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144\": rpc error: code = NotFound desc = could not find container \"fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144\": container with ID starting with fc0dc7ab0743f6dc983fd1494c20832ef9528dbeb3931709615b8ae5efb63144 not found: ID does not exist" Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.152378 4892 scope.go:117] "RemoveContainer" containerID="0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54" Oct 06 12:24:55 crc kubenswrapper[4892]: E1006 12:24:55.152735 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54\": container with ID starting with 0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54 not found: ID does not exist" containerID="0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54" Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.152764 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54"} err="failed to get container status \"0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54\": rpc error: code = NotFound desc = could not find container \"0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54\": container with ID starting with 0c0e79174589644cd000adc2caeb8b79989a3bd48f1af33857db72c45a7fbf54 not found: ID does not exist" Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.174125 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qtgzv" podStartSLOduration=13.255053809 podStartE2EDuration="22.174107904s" podCreationTimestamp="2025-10-06 12:24:33 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.426698723 +0000 UTC m=+965.976404488" lastFinishedPulling="2025-10-06 12:24:48.345752798 +0000 UTC m=+974.895458583" observedRunningTime="2025-10-06 12:24:55.149176924 +0000 UTC m=+981.698882689" watchObservedRunningTime="2025-10-06 12:24:55.174107904 +0000 UTC m=+981.723813669" Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.176169 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.444432788 podStartE2EDuration="19.176162156s" podCreationTimestamp="2025-10-06 12:24:36 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.410542462 +0000 UTC m=+965.960248227" lastFinishedPulling="2025-10-06 12:24:54.14227184 +0000 UTC m=+980.691977595" observedRunningTime="2025-10-06 12:24:55.173666261 +0000 UTC m=+981.723372026" watchObservedRunningTime="2025-10-06 12:24:55.176162156 +0000 UTC m=+981.725867911" Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.190044 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8f8dc5f77-86rgb"] Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.195862 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8f8dc5f77-86rgb"] Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.611493 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:55 crc kubenswrapper[4892]: I1006 12:24:55.688504 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.129043 4892 generic.go:334] "Generic 
(PLEG): container finished" podID="19207559-7eb7-49b5-9b73-0641f426ab63" containerID="32257ac18c2e01d677da5a28176777e406fc2fb6f40edbb3bb8eac11c34b27a7" exitCode=0 Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.129159 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19207559-7eb7-49b5-9b73-0641f426ab63","Type":"ContainerDied","Data":"32257ac18c2e01d677da5a28176777e406fc2fb6f40edbb3bb8eac11c34b27a7"} Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.134257 4892 generic.go:334] "Generic (PLEG): container finished" podID="728515b5-40b3-48f4-8452-85ce84a9930a" containerID="b990c92695e2d6e08f1d9d38279c1c747f05c28f8ef58e77dff3d595620f6964" exitCode=0 Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.134508 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"728515b5-40b3-48f4-8452-85ce84a9930a","Type":"ContainerDied","Data":"b990c92695e2d6e08f1d9d38279c1c747f05c28f8ef58e77dff3d595620f6964"} Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.135586 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.193459 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d54376a9-b390-48c7-a32f-60d1d73b93d0" path="/var/lib/kubelet/pods/d54376a9-b390-48c7-a32f-60d1d73b93d0/volumes" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.203869 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.507033 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75f8dc6b75-9h4rs"] Oct 06 12:24:56 crc kubenswrapper[4892]: E1006 12:24:56.512148 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerName="init" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.512258 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerName="init" Oct 06 12:24:56 crc kubenswrapper[4892]: E1006 12:24:56.512334 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerName="dnsmasq-dns" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.512396 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerName="dnsmasq-dns" Oct 06 12:24:56 crc kubenswrapper[4892]: E1006 12:24:56.512463 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7343495-cff2-4968-8b72-2153c84f0a54" containerName="init" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.512512 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7343495-cff2-4968-8b72-2153c84f0a54" containerName="init" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.512727 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d54376a9-b390-48c7-a32f-60d1d73b93d0" containerName="dnsmasq-dns" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.512789 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7343495-cff2-4968-8b72-2153c84f0a54" containerName="init" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.513744 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.515812 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.516378 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75f8dc6b75-9h4rs"]
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.530471 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xh7kj"]
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.534402 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.538570 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.551381 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xh7kj"]
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613762 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-ovn-rundir\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613814 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwtd\" (UniqueName: \"kubernetes.io/projected/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-kube-api-access-hwwtd\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613862 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-ovs-rundir\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-config\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613911 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613930 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-dns-svc\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613962 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-combined-ca-bundle\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.613986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.614008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvt76\" (UniqueName: \"kubernetes.io/projected/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-kube-api-access-gvt76\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.614031 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-ovsdbserver-sb\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.715661 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-ovn-rundir\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.715920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwtd\" (UniqueName: \"kubernetes.io/projected/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-kube-api-access-hwwtd\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.715948 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-ovn-rundir\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716020 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-ovs-rundir\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.715965 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-ovs-rundir\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716093 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-config\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716138 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716171 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-dns-svc\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716252 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-combined-ca-bundle\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716304 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716349 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvt76\" (UniqueName: \"kubernetes.io/projected/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-kube-api-access-gvt76\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.716396 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-ovsdbserver-sb\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.717015 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.718005 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-config\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.718551 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-ovsdbserver-sb\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs"
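Every VerifyControllerAttachedVolume/MountVolume record above identifies a volume by a UniqueName of the form <plugin>/<podUID>-<volumeName>, e.g. kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config. Since pod UIDs are fixed-width (36 characters), the name splits apart mechanically; the helper below is ours, written only for picking such log fields apart:

    package main

    import (
        "fmt"
        "strings"
    )

    // splitUniqueName separates "<plugin>/<podUID>-<volumeName>" into its
    // parts, relying on the 36-character UID between the last slash and
    // the volume name.
    func splitUniqueName(u string) (plugin, podUID, volume string, ok bool) {
        i := strings.LastIndex(u, "/")
        if i < 0 || len(u) < i+38 {
            return "", "", "", false
        }
        rest := u[i+1:]
        return u[:i], rest[:36], rest[37:], true
    }

    func main() {
        fmt.Println(splitUniqueName(
            "kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config"))
        // kubernetes.io/configmap 25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1 config true
    }

The same shape holds for the host-path, secret, and projected volumes in these records, so one helper covers all the plugins that appear here.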
(UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-ovsdbserver-sb\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.718909 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-dns-svc\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.721800 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.734899 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-combined-ca-bundle\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.739947 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvt76\" (UniqueName: \"kubernetes.io/projected/ac7b9745-1f4f-4a0f-8803-4ecd222fd160-kube-api-access-gvt76\") pod \"ovn-controller-metrics-xh7kj\" (UID: \"ac7b9745-1f4f-4a0f-8803-4ecd222fd160\") " pod="openstack/ovn-controller-metrics-xh7kj" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.742507 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwtd\" (UniqueName: \"kubernetes.io/projected/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-kube-api-access-hwwtd\") pod \"dnsmasq-dns-75f8dc6b75-9h4rs\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.783354 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f8dc6b75-9h4rs"] Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.783861 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.819579 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc86d8fc-pp2hd"] Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.827728 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc86d8fc-pp2hd"] Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.827842 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.831757 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.857454 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.919504 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-config\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.919715 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-dns-svc\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.919777 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.919927 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkb84\" (UniqueName: \"kubernetes.io/projected/f5de5aa3-3022-4811-854d-6bd0446e5907-kube-api-access-bkb84\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:56 crc kubenswrapper[4892]: I1006 12:24:56.919946 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.020918 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkb84\" (UniqueName: \"kubernetes.io/projected/f5de5aa3-3022-4811-854d-6bd0446e5907-kube-api-access-bkb84\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.020951 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.020973 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-config\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.021000 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-dns-svc\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd"
\"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.021047 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.021824 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.022548 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.023030 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-config\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.023514 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-dns-svc\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.037184 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkb84\" (UniqueName: \"kubernetes.io/projected/f5de5aa3-3022-4811-854d-6bd0446e5907-kube-api-access-bkb84\") pod \"dnsmasq-dns-5dc86d8fc-pp2hd\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.146033 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.154951 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"19207559-7eb7-49b5-9b73-0641f426ab63","Type":"ContainerStarted","Data":"874adb8597bd01a8b07399ab1e5aef8ad5a78371cb25b152b2c05d4ada04fb79"}
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.157510 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"728515b5-40b3-48f4-8452-85ce84a9930a","Type":"ContainerStarted","Data":"f4f7293f5eaf300f8e903c4b13d1faa94bf3ff638e3aa56a4159210fcece11fc"}
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.183151 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.241462634 podStartE2EDuration="31.183133876s" podCreationTimestamp="2025-10-06 12:24:26 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.112128642 +0000 UTC m=+965.661834397" lastFinishedPulling="2025-10-06 12:24:48.053799864 +0000 UTC m=+974.603505639" observedRunningTime="2025-10-06 12:24:57.178496887 +0000 UTC m=+983.728202652" watchObservedRunningTime="2025-10-06 12:24:57.183133876 +0000 UTC m=+983.732839641"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.203329 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.619719114 podStartE2EDuration="31.203301093s" podCreationTimestamp="2025-10-06 12:24:26 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.050648448 +0000 UTC m=+965.600354213" lastFinishedPulling="2025-10-06 12:24:48.634230407 +0000 UTC m=+975.183936192" observedRunningTime="2025-10-06 12:24:57.197492878 +0000 UTC m=+983.747198643" watchObservedRunningTime="2025-10-06 12:24:57.203301093 +0000 UTC m=+983.753006858"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.228498 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f8dc6b75-9h4rs"]
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.356431 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xh7kj"]
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.505613 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.558773 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Oct 06 12:24:57 crc kubenswrapper[4892]: I1006 12:24:57.640704 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc86d8fc-pp2hd"]
Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.054143 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.054301 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.146167 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.146244 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.191398 4892 generic.go:334] "Generic (PLEG): container finished" podID="25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" containerID="465578b6ba0ab7cbbb29afe08c89035814678c8db9aad5ac2e6dd137f8d479dd" exitCode=0
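The probe records above show startup probes flapping once ("unhealthy") before reporting "started", while readiness is logged with an empty status until a later "ready"; in Kubernetes, readiness results only count once the startup probe has succeeded. A toy gate illustrating that ordering (ours, not kubelet code):

    package main

    import "fmt"

    type probeGate struct{ started bool }

    // observe reports startup results until the startup probe succeeds,
    // then switches to reporting readiness, mirroring the log ordering.
    func (g *probeGate) observe(startupOK, readyOK bool) string {
        if !g.started {
            if !startupOK {
                return `startup status="unhealthy"`
            }
            g.started = true
            return `startup status="started"`
        }
        if readyOK {
            return `readiness status="ready"`
        }
        return `readiness status=""`
    }

    func main() {
        g := &probeGate{}
        fmt.Println(g.observe(false, false)) // startup status="unhealthy"
        fmt.Println(g.observe(true, false))  // startup status="started"
        fmt.Println(g.observe(true, false))  // readiness status=""
        fmt.Println(g.observe(true, true))   // readiness status="ready"
    }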
container finished" podID="25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" containerID="465578b6ba0ab7cbbb29afe08c89035814678c8db9aad5ac2e6dd137f8d479dd" exitCode=0 Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.192992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xh7kj" event={"ID":"ac7b9745-1f4f-4a0f-8803-4ecd222fd160","Type":"ContainerStarted","Data":"a515438f592a020688f5aa0d65f59da3bbc3a3de869cb0d86e15cf695b4ed5d8"} Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.193044 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xh7kj" event={"ID":"ac7b9745-1f4f-4a0f-8803-4ecd222fd160","Type":"ContainerStarted","Data":"f980571e20865fd70a3ba0e2325d9ad2ba56515aa4771e087fcff90e85c83410"} Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.193058 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" event={"ID":"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1","Type":"ContainerDied","Data":"465578b6ba0ab7cbbb29afe08c89035814678c8db9aad5ac2e6dd137f8d479dd"} Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.193076 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" event={"ID":"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1","Type":"ContainerStarted","Data":"9e1586fab1327b55f9cf90758ed5a2f414e9ce01e21b0816fe2b18e955008a9a"} Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.195643 4892 generic.go:334] "Generic (PLEG): container finished" podID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerID="16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a" exitCode=0 Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.195689 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" event={"ID":"f5de5aa3-3022-4811-854d-6bd0446e5907","Type":"ContainerDied","Data":"16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a"} Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.197428 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.197707 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" event={"ID":"f5de5aa3-3022-4811-854d-6bd0446e5907","Type":"ContainerStarted","Data":"425306a2936e05ee218db5eb626e470190d237bd11168c0e7296d99e3d06b197"} Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.250417 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xh7kj" podStartSLOduration=2.250399876 podStartE2EDuration="2.250399876s" podCreationTimestamp="2025-10-06 12:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:24:58.212938309 +0000 UTC m=+984.762644074" watchObservedRunningTime="2025-10-06 12:24:58.250399876 +0000 UTC m=+984.800105641" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.276827 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.517523 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.550788 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 
12:24:58.552526 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.556711 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-s5x6x" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.556897 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.556909 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.557006 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.580901 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.615008 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.665045 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-ovsdbserver-sb\") pod \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.665092 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwwtd\" (UniqueName: \"kubernetes.io/projected/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-kube-api-access-hwwtd\") pod \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.665237 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config\") pod \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.665845 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-dns-svc\") pod \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\" (UID: \"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1\") " Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.666024 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.666093 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.666122 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.666187 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d204094-c3c0-4f10-8668-731e258b54f6-scripts\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.666249 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d204094-c3c0-4f10-8668-731e258b54f6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.666265 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d204094-c3c0-4f10-8668-731e258b54f6-config\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.666287 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wv27\" (UniqueName: \"kubernetes.io/projected/1d204094-c3c0-4f10-8668-731e258b54f6-kube-api-access-4wv27\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.673587 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-kube-api-access-hwwtd" (OuterVolumeSpecName: "kube-api-access-hwwtd") pod "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" (UID: "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1"). InnerVolumeSpecName "kube-api-access-hwwtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.683483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config" (OuterVolumeSpecName: "config") pod "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" (UID: "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.684107 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" (UID: "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.684557 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" (UID: "25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.771891 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.771963 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772033 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d204094-c3c0-4f10-8668-731e258b54f6-scripts\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d204094-c3c0-4f10-8668-731e258b54f6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d204094-c3c0-4f10-8668-731e258b54f6-config\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772196 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wv27\" (UniqueName: \"kubernetes.io/projected/1d204094-c3c0-4f10-8668-731e258b54f6-kube-api-access-4wv27\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772243 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772298 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772308 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772316 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.772327 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwwtd\" (UniqueName: 
\"kubernetes.io/projected/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1-kube-api-access-hwwtd\") on node \"crc\" DevicePath \"\"" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.773440 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d204094-c3c0-4f10-8668-731e258b54f6-scripts\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.776575 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d204094-c3c0-4f10-8668-731e258b54f6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.778625 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.778703 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.778953 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d204094-c3c0-4f10-8668-731e258b54f6-config\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.782249 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d204094-c3c0-4f10-8668-731e258b54f6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.791317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wv27\" (UniqueName: \"kubernetes.io/projected/1d204094-c3c0-4f10-8668-731e258b54f6-kube-api-access-4wv27\") pod \"ovn-northd-0\" (UID: \"1d204094-c3c0-4f10-8668-731e258b54f6\") " pod="openstack/ovn-northd-0" Oct 06 12:24:58 crc kubenswrapper[4892]: I1006 12:24:58.906899 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.203503 4892 generic.go:334] "Generic (PLEG): container finished" podID="83392b37-0087-4c7c-ab0d-91af0c170445" containerID="f6e35bb22f41354b2219d72a67a39722041f630b90a98e5e79da11808ed8e91a" exitCode=0 Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.203807 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerDied","Data":"f6e35bb22f41354b2219d72a67a39722041f630b90a98e5e79da11808ed8e91a"} Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.206589 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" event={"ID":"25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1","Type":"ContainerDied","Data":"9e1586fab1327b55f9cf90758ed5a2f414e9ce01e21b0816fe2b18e955008a9a"} Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.206641 4892 scope.go:117] "RemoveContainer" containerID="465578b6ba0ab7cbbb29afe08c89035814678c8db9aad5ac2e6dd137f8d479dd" Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.206770 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75f8dc6b75-9h4rs" Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.211795 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" event={"ID":"f5de5aa3-3022-4811-854d-6bd0446e5907","Type":"ContainerStarted","Data":"0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e"} Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.211829 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.253911 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" podStartSLOduration=3.253876276 podStartE2EDuration="3.253876276s" podCreationTimestamp="2025-10-06 12:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:24:59.251715631 +0000 UTC m=+985.801421396" watchObservedRunningTime="2025-10-06 12:24:59.253876276 +0000 UTC m=+985.803582041" Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.340005 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75f8dc6b75-9h4rs"] Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.350120 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 06 12:24:59 crc kubenswrapper[4892]: I1006 12:24:59.356909 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75f8dc6b75-9h4rs"] Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.183015 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" path="/var/lib/kubelet/pods/25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1/volumes" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.235251 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d204094-c3c0-4f10-8668-731e258b54f6","Type":"ContainerStarted","Data":"c6c3ca2643c35ba0916c365a22c7116742196378996796e08ac6df6bf97116e0"} Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.235295 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"1d204094-c3c0-4f10-8668-731e258b54f6","Type":"ContainerStarted","Data":"c8d6caa54d2f5968942d90edc16af59e476ab5d42cfae7bb890526e029569acc"} Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.235307 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"1d204094-c3c0-4f10-8668-731e258b54f6","Type":"ContainerStarted","Data":"efca4a427bdfa5f2ee3c3ec49fb05c9c14757158eb2cdb58b0e5cb75565cf550"} Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.256262 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.819053009 podStartE2EDuration="2.256244762s" podCreationTimestamp="2025-10-06 12:24:58 +0000 UTC" firstStartedPulling="2025-10-06 12:24:59.352453952 +0000 UTC m=+985.902159717" lastFinishedPulling="2025-10-06 12:24:59.789645705 +0000 UTC m=+986.339351470" observedRunningTime="2025-10-06 12:25:00.25384678 +0000 UTC m=+986.803552585" watchObservedRunningTime="2025-10-06 12:25:00.256244762 +0000 UTC m=+986.805950527" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.259742 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.334787 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc86d8fc-pp2hd"] Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.376436 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-576bc8c5c-zqcrg"] Oct 06 12:25:00 crc kubenswrapper[4892]: E1006 12:25:00.376814 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" containerName="init" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.376829 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" containerName="init" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.376995 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ed64ce-d4b3-4cd0-bc4f-a90faee5cea1" containerName="init" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.377857 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.384996 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-576bc8c5c-zqcrg"] Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.399034 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcq9j\" (UniqueName: \"kubernetes.io/projected/5c260537-1016-427a-a2b8-4a9046dbd3de-kube-api-access-rcq9j\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.399114 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-dns-svc\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.399167 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-nb\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.399211 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-config\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.399234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-sb\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.500638 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-nb\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.500736 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-config\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.500772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-sb\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.500835 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rcq9j\" (UniqueName: \"kubernetes.io/projected/5c260537-1016-427a-a2b8-4a9046dbd3de-kube-api-access-rcq9j\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.500908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-dns-svc\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.501587 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-nb\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.501627 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-config\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.501842 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-sb\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.501873 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-dns-svc\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.516382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcq9j\" (UniqueName: \"kubernetes.io/projected/5c260537-1016-427a-a2b8-4a9046dbd3de-kube-api-access-rcq9j\") pod \"dnsmasq-dns-576bc8c5c-zqcrg\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") " pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:00 crc kubenswrapper[4892]: I1006 12:25:00.696369 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.154869 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-576bc8c5c-zqcrg"] Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.248972 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" event={"ID":"5c260537-1016-427a-a2b8-4a9046dbd3de","Type":"ContainerStarted","Data":"2637a77367972bed90787e337b012d5ee051ea9b6577c14ff9471dde64d50fdd"} Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.250434 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.249217 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" podUID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerName="dnsmasq-dns" containerID="cri-o://0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e" gracePeriod=10 Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.435965 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.464783 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.464902 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.467640 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.468022 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.468193 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-sggqn" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.468225 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.518839 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.518884 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f90d8be-05b7-4668-be7c-1494621a363b-lock\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.518941 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.518983 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8xp\" (UniqueName: 
\"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-kube-api-access-kd8xp\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.519008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f90d8be-05b7-4668-be7c-1494621a363b-cache\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.621035 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8xp\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-kube-api-access-kd8xp\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.621469 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f90d8be-05b7-4668-be7c-1494621a363b-cache\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.621557 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.621613 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f90d8be-05b7-4668-be7c-1494621a363b-lock\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.621701 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: E1006 12:25:01.621962 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 12:25:01 crc kubenswrapper[4892]: E1006 12:25:01.621984 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 12:25:01 crc kubenswrapper[4892]: E1006 12:25:01.622233 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift podName:9f90d8be-05b7-4668-be7c-1494621a363b nodeName:}" failed. No retries permitted until 2025-10-06 12:25:02.122183957 +0000 UTC m=+988.671889722 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift") pod "swift-storage-0" (UID: "9f90d8be-05b7-4668-be7c-1494621a363b") : configmap "swift-ring-files" not found Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.623819 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.624267 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/9f90d8be-05b7-4668-be7c-1494621a363b-lock\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.624355 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/9f90d8be-05b7-4668-be7c-1494621a363b-cache\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.650995 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8xp\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-kube-api-access-kd8xp\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.672705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.729172 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.827788 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-config\") pod \"f5de5aa3-3022-4811-854d-6bd0446e5907\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.827850 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-sb\") pod \"f5de5aa3-3022-4811-854d-6bd0446e5907\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.827958 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkb84\" (UniqueName: \"kubernetes.io/projected/f5de5aa3-3022-4811-854d-6bd0446e5907-kube-api-access-bkb84\") pod \"f5de5aa3-3022-4811-854d-6bd0446e5907\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.827987 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-dns-svc\") pod \"f5de5aa3-3022-4811-854d-6bd0446e5907\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.828122 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-nb\") pod \"f5de5aa3-3022-4811-854d-6bd0446e5907\" (UID: \"f5de5aa3-3022-4811-854d-6bd0446e5907\") " Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.831536 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5de5aa3-3022-4811-854d-6bd0446e5907-kube-api-access-bkb84" (OuterVolumeSpecName: "kube-api-access-bkb84") pod "f5de5aa3-3022-4811-854d-6bd0446e5907" (UID: "f5de5aa3-3022-4811-854d-6bd0446e5907"). InnerVolumeSpecName "kube-api-access-bkb84". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.863151 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f5de5aa3-3022-4811-854d-6bd0446e5907" (UID: "f5de5aa3-3022-4811-854d-6bd0446e5907"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.864410 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5de5aa3-3022-4811-854d-6bd0446e5907" (UID: "f5de5aa3-3022-4811-854d-6bd0446e5907"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.868970 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-config" (OuterVolumeSpecName: "config") pod "f5de5aa3-3022-4811-854d-6bd0446e5907" (UID: "f5de5aa3-3022-4811-854d-6bd0446e5907"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.875448 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f5de5aa3-3022-4811-854d-6bd0446e5907" (UID: "f5de5aa3-3022-4811-854d-6bd0446e5907"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.930179 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.930220 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.930233 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.930289 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkb84\" (UniqueName: \"kubernetes.io/projected/f5de5aa3-3022-4811-854d-6bd0446e5907-kube-api-access-bkb84\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:01 crc kubenswrapper[4892]: I1006 12:25:01.930302 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5de5aa3-3022-4811-854d-6bd0446e5907-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:02 crc kubenswrapper[4892]: E1006 12:25:02.111541 4892 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.144:53308->38.102.83.144:40237: write tcp 38.102.83.144:53308->38.102.83.144:40237: write: broken pipe Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.135043 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:02 crc kubenswrapper[4892]: E1006 12:25:02.135272 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 12:25:02 crc kubenswrapper[4892]: E1006 12:25:02.135490 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 12:25:02 crc kubenswrapper[4892]: E1006 12:25:02.135615 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift podName:9f90d8be-05b7-4668-be7c-1494621a363b nodeName:}" failed. No retries permitted until 2025-10-06 12:25:03.135595213 +0000 UTC m=+989.685300978 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift") pod "swift-storage-0" (UID: "9f90d8be-05b7-4668-be7c-1494621a363b") : configmap "swift-ring-files" not found Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.160704 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.259312 4892 generic.go:334] "Generic (PLEG): container finished" podID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerID="0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e" exitCode=0 Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.259399 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" event={"ID":"f5de5aa3-3022-4811-854d-6bd0446e5907","Type":"ContainerDied","Data":"0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e"} Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.259425 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" event={"ID":"f5de5aa3-3022-4811-854d-6bd0446e5907","Type":"ContainerDied","Data":"425306a2936e05ee218db5eb626e470190d237bd11168c0e7296d99e3d06b197"} Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.259441 4892 scope.go:117] "RemoveContainer" containerID="0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.259540 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc86d8fc-pp2hd" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.262564 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerID="41b8e69ae545d953b71d709ff8026dfb58f5b19f07baa3e9635013876c7ab0a2" exitCode=0 Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.262690 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" event={"ID":"5c260537-1016-427a-a2b8-4a9046dbd3de","Type":"ContainerDied","Data":"41b8e69ae545d953b71d709ff8026dfb58f5b19f07baa3e9635013876c7ab0a2"} Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.275544 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.281986 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc86d8fc-pp2hd"] Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.290009 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc86d8fc-pp2hd"] Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.313472 4892 scope.go:117] "RemoveContainer" containerID="16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.417062 4892 scope.go:117] "RemoveContainer" containerID="0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e" Oct 06 12:25:02 crc kubenswrapper[4892]: E1006 12:25:02.417545 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e\": container with ID starting with 0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e not found: ID does not exist" 
containerID="0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.417584 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e"} err="failed to get container status \"0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e\": rpc error: code = NotFound desc = could not find container \"0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e\": container with ID starting with 0d03877895bfa47971b0eff87d3a93ecf35910d5e73be05536983c97c2b8574e not found: ID does not exist" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.417611 4892 scope.go:117] "RemoveContainer" containerID="16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a" Oct 06 12:25:02 crc kubenswrapper[4892]: E1006 12:25:02.418708 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a\": container with ID starting with 16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a not found: ID does not exist" containerID="16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a" Oct 06 12:25:02 crc kubenswrapper[4892]: I1006 12:25:02.418740 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a"} err="failed to get container status \"16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a\": rpc error: code = NotFound desc = could not find container \"16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a\": container with ID starting with 16b37dc3702c10f7f6cee19a21b24638ba39f913f4514442db94aca50ed8607a not found: ID does not exist" Oct 06 12:25:03 crc kubenswrapper[4892]: I1006 12:25:03.154668 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:03 crc kubenswrapper[4892]: E1006 12:25:03.154846 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 12:25:03 crc kubenswrapper[4892]: E1006 12:25:03.155423 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 12:25:03 crc kubenswrapper[4892]: E1006 12:25:03.155579 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift podName:9f90d8be-05b7-4668-be7c-1494621a363b nodeName:}" failed. No retries permitted until 2025-10-06 12:25:05.155485798 +0000 UTC m=+991.705191563 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift") pod "swift-storage-0" (UID: "9f90d8be-05b7-4668-be7c-1494621a363b") : configmap "swift-ring-files" not found Oct 06 12:25:03 crc kubenswrapper[4892]: I1006 12:25:03.278975 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" event={"ID":"5c260537-1016-427a-a2b8-4a9046dbd3de","Type":"ContainerStarted","Data":"2a9ee1a37261f8bc47efa6c831feb86237893b5061cba640c7269575eb601583"} Oct 06 12:25:03 crc kubenswrapper[4892]: I1006 12:25:03.279226 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:03 crc kubenswrapper[4892]: I1006 12:25:03.300596 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" podStartSLOduration=3.300581563 podStartE2EDuration="3.300581563s" podCreationTimestamp="2025-10-06 12:25:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:25:03.295119368 +0000 UTC m=+989.844825133" watchObservedRunningTime="2025-10-06 12:25:03.300581563 +0000 UTC m=+989.850287328" Oct 06 12:25:04 crc kubenswrapper[4892]: I1006 12:25:04.189745 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5de5aa3-3022-4811-854d-6bd0446e5907" path="/var/lib/kubelet/pods/f5de5aa3-3022-4811-854d-6bd0446e5907/volumes" Oct 06 12:25:04 crc kubenswrapper[4892]: I1006 12:25:04.296960 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 06 12:25:04 crc kubenswrapper[4892]: I1006 12:25:04.380366 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.187604 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:05 crc kubenswrapper[4892]: E1006 12:25:05.187814 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 12:25:05 crc kubenswrapper[4892]: E1006 12:25:05.187948 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 12:25:05 crc kubenswrapper[4892]: E1006 12:25:05.187997 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift podName:9f90d8be-05b7-4668-be7c-1494621a363b nodeName:}" failed. No retries permitted until 2025-10-06 12:25:09.187981886 +0000 UTC m=+995.737687651 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift") pod "swift-storage-0" (UID: "9f90d8be-05b7-4668-be7c-1494621a363b") : configmap "swift-ring-files" not found Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.442314 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rjntl"] Oct 06 12:25:05 crc kubenswrapper[4892]: E1006 12:25:05.442775 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerName="dnsmasq-dns" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.442790 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerName="dnsmasq-dns" Oct 06 12:25:05 crc kubenswrapper[4892]: E1006 12:25:05.442806 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerName="init" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.442812 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerName="init" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.443011 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5de5aa3-3022-4811-854d-6bd0446e5907" containerName="dnsmasq-dns" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.443581 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.448212 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.448860 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.456940 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.477946 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rjntl"] Oct 06 12:25:05 crc kubenswrapper[4892]: E1006 12:25:05.478873 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-jqgf6 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-jqgf6 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-rjntl" podUID="25a03858-3e32-49ae-b802-4066a30b4f59" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.483395 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-xjptv"] Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.486211 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.499166 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rjntl"] Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.507010 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xjptv"] Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.596754 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-ring-data-devices\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597051 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-ring-data-devices\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597091 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-combined-ca-bundle\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597154 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-scripts\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597196 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqgf6\" (UniqueName: \"kubernetes.io/projected/25a03858-3e32-49ae-b802-4066a30b4f59-kube-api-access-jqgf6\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597235 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/25a03858-3e32-49ae-b802-4066a30b4f59-etc-swift\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597260 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a045df0b-e5d4-4e68-b29f-47e270efa265-etc-swift\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597436 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-combined-ca-bundle\") pod \"swift-ring-rebalance-xjptv\" (UID: 
\"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597491 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-dispersionconf\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597601 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-scripts\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597636 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-dispersionconf\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597686 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-swiftconf\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597728 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-swiftconf\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.597838 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8rc\" (UniqueName: \"kubernetes.io/projected/a045df0b-e5d4-4e68-b29f-47e270efa265-kube-api-access-5h8rc\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.699643 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-scripts\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.699728 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqgf6\" (UniqueName: \"kubernetes.io/projected/25a03858-3e32-49ae-b802-4066a30b4f59-kube-api-access-jqgf6\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.699795 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/25a03858-3e32-49ae-b802-4066a30b4f59-etc-swift\") pod \"swift-ring-rebalance-rjntl\" (UID: 
\"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.699841 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a045df0b-e5d4-4e68-b29f-47e270efa265-etc-swift\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.699920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-combined-ca-bundle\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.699957 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-dispersionconf\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700024 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-scripts\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700061 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-dispersionconf\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700109 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-swiftconf\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700155 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-swiftconf\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700208 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8rc\" (UniqueName: \"kubernetes.io/projected/a045df0b-e5d4-4e68-b29f-47e270efa265-kube-api-access-5h8rc\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700264 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-ring-data-devices\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc 
kubenswrapper[4892]: I1006 12:25:05.700301 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-ring-data-devices\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700374 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-combined-ca-bundle\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.700980 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/25a03858-3e32-49ae-b802-4066a30b4f59-etc-swift\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.701271 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a045df0b-e5d4-4e68-b29f-47e270efa265-etc-swift\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.701542 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-ring-data-devices\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.701781 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-ring-data-devices\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.701960 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-scripts\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.702316 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-scripts\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.704966 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-swiftconf\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.707668 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-swiftconf\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.708120 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-combined-ca-bundle\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.708775 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-dispersionconf\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.710554 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-combined-ca-bundle\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.713579 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-dispersionconf\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.716865 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqgf6\" (UniqueName: \"kubernetes.io/projected/25a03858-3e32-49ae-b802-4066a30b4f59-kube-api-access-jqgf6\") pod \"swift-ring-rebalance-rjntl\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.724541 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8rc\" (UniqueName: \"kubernetes.io/projected/a045df0b-e5d4-4e68-b29f-47e270efa265-kube-api-access-5h8rc\") pod \"swift-ring-rebalance-xjptv\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:05 crc kubenswrapper[4892]: I1006 12:25:05.800780 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:06 crc kubenswrapper[4892]: W1006 12:25:06.271612 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda045df0b_e5d4_4e68_b29f_47e270efa265.slice/crio-11bbb872b71dfaea235175a44ec85f963232278b6a34f0537a25dc236c672320 WatchSource:0}: Error finding container 11bbb872b71dfaea235175a44ec85f963232278b6a34f0537a25dc236c672320: Status 404 returned error can't find the container with id 11bbb872b71dfaea235175a44ec85f963232278b6a34f0537a25dc236c672320 Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.277075 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-xjptv"] Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.303400 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xjptv" event={"ID":"a045df0b-e5d4-4e68-b29f-47e270efa265","Type":"ContainerStarted","Data":"11bbb872b71dfaea235175a44ec85f963232278b6a34f0537a25dc236c672320"} Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.306552 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.306547 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerStarted","Data":"165b75baeec76a8834f499781ad98fd3c24d3fd287af404da8cbe5e88305d1db"} Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.327268 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.415390 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-dispersionconf\") pod \"25a03858-3e32-49ae-b802-4066a30b4f59\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.415450 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-ring-data-devices\") pod \"25a03858-3e32-49ae-b802-4066a30b4f59\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.415510 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-scripts\") pod \"25a03858-3e32-49ae-b802-4066a30b4f59\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.415533 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-swiftconf\") pod \"25a03858-3e32-49ae-b802-4066a30b4f59\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.415563 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqgf6\" (UniqueName: \"kubernetes.io/projected/25a03858-3e32-49ae-b802-4066a30b4f59-kube-api-access-jqgf6\") pod \"25a03858-3e32-49ae-b802-4066a30b4f59\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " Oct 06 12:25:06 crc 
kubenswrapper[4892]: I1006 12:25:06.415595 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/25a03858-3e32-49ae-b802-4066a30b4f59-etc-swift\") pod \"25a03858-3e32-49ae-b802-4066a30b4f59\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.415626 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-combined-ca-bundle\") pod \"25a03858-3e32-49ae-b802-4066a30b4f59\" (UID: \"25a03858-3e32-49ae-b802-4066a30b4f59\") " Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.416235 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25a03858-3e32-49ae-b802-4066a30b4f59-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "25a03858-3e32-49ae-b802-4066a30b4f59" (UID: "25a03858-3e32-49ae-b802-4066a30b4f59"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.416264 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "25a03858-3e32-49ae-b802-4066a30b4f59" (UID: "25a03858-3e32-49ae-b802-4066a30b4f59"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.416552 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-scripts" (OuterVolumeSpecName: "scripts") pod "25a03858-3e32-49ae-b802-4066a30b4f59" (UID: "25a03858-3e32-49ae-b802-4066a30b4f59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.417218 4892 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/25a03858-3e32-49ae-b802-4066a30b4f59-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.417265 4892 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.417281 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/25a03858-3e32-49ae-b802-4066a30b4f59-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.422194 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "25a03858-3e32-49ae-b802-4066a30b4f59" (UID: "25a03858-3e32-49ae-b802-4066a30b4f59"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.422750 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "25a03858-3e32-49ae-b802-4066a30b4f59" (UID: "25a03858-3e32-49ae-b802-4066a30b4f59"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.423026 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a03858-3e32-49ae-b802-4066a30b4f59-kube-api-access-jqgf6" (OuterVolumeSpecName: "kube-api-access-jqgf6") pod "25a03858-3e32-49ae-b802-4066a30b4f59" (UID: "25a03858-3e32-49ae-b802-4066a30b4f59"). InnerVolumeSpecName "kube-api-access-jqgf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.423834 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25a03858-3e32-49ae-b802-4066a30b4f59" (UID: "25a03858-3e32-49ae-b802-4066a30b4f59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.519032 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.519065 4892 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.519074 4892 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/25a03858-3e32-49ae-b802-4066a30b4f59-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:06 crc kubenswrapper[4892]: I1006 12:25:06.519084 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqgf6\" (UniqueName: \"kubernetes.io/projected/25a03858-3e32-49ae-b802-4066a30b4f59-kube-api-access-jqgf6\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:07 crc kubenswrapper[4892]: I1006 12:25:07.313868 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rjntl" Oct 06 12:25:07 crc kubenswrapper[4892]: I1006 12:25:07.393846 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rjntl"] Oct 06 12:25:07 crc kubenswrapper[4892]: I1006 12:25:07.400292 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-rjntl"] Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.187948 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a03858-3e32-49ae-b802-4066a30b4f59" path="/var/lib/kubelet/pods/25a03858-3e32-49ae-b802-4066a30b4f59/volumes" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.225166 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-t6bmr"] Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.226652 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-t6bmr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.233695 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-t6bmr"] Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.354589 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn6t6\" (UniqueName: \"kubernetes.io/projected/a660a565-71b7-4fd3-8864-f633a0dc1240-kube-api-access-fn6t6\") pod \"keystone-db-create-t6bmr\" (UID: \"a660a565-71b7-4fd3-8864-f633a0dc1240\") " pod="openstack/keystone-db-create-t6bmr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.419727 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vgr8m"] Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.426268 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vgr8m" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.435629 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vgr8m"] Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.456466 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn6t6\" (UniqueName: \"kubernetes.io/projected/a660a565-71b7-4fd3-8864-f633a0dc1240-kube-api-access-fn6t6\") pod \"keystone-db-create-t6bmr\" (UID: \"a660a565-71b7-4fd3-8864-f633a0dc1240\") " pod="openstack/keystone-db-create-t6bmr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.487456 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn6t6\" (UniqueName: \"kubernetes.io/projected/a660a565-71b7-4fd3-8864-f633a0dc1240-kube-api-access-fn6t6\") pod \"keystone-db-create-t6bmr\" (UID: \"a660a565-71b7-4fd3-8864-f633a0dc1240\") " pod="openstack/keystone-db-create-t6bmr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.557939 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvm8d\" (UniqueName: \"kubernetes.io/projected/7a5823de-6b73-4608-b37e-031dc44dc68b-kube-api-access-kvm8d\") pod \"placement-db-create-vgr8m\" (UID: \"7a5823de-6b73-4608-b37e-031dc44dc68b\") " pod="openstack/placement-db-create-vgr8m" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.559463 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-t6bmr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.659594 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvm8d\" (UniqueName: \"kubernetes.io/projected/7a5823de-6b73-4608-b37e-031dc44dc68b-kube-api-access-kvm8d\") pod \"placement-db-create-vgr8m\" (UID: \"7a5823de-6b73-4608-b37e-031dc44dc68b\") " pod="openstack/placement-db-create-vgr8m" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.663728 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pnfbr"] Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.664801 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pnfbr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.676155 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pnfbr"] Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.679119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvm8d\" (UniqueName: \"kubernetes.io/projected/7a5823de-6b73-4608-b37e-031dc44dc68b-kube-api-access-kvm8d\") pod \"placement-db-create-vgr8m\" (UID: \"7a5823de-6b73-4608-b37e-031dc44dc68b\") " pod="openstack/placement-db-create-vgr8m" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.750001 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vgr8m" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.762271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4bbj\" (UniqueName: \"kubernetes.io/projected/30982372-73ba-48f1-b3b3-541d8c51d6ce-kube-api-access-h4bbj\") pod \"glance-db-create-pnfbr\" (UID: \"30982372-73ba-48f1-b3b3-541d8c51d6ce\") " pod="openstack/glance-db-create-pnfbr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.864290 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4bbj\" (UniqueName: \"kubernetes.io/projected/30982372-73ba-48f1-b3b3-541d8c51d6ce-kube-api-access-h4bbj\") pod \"glance-db-create-pnfbr\" (UID: \"30982372-73ba-48f1-b3b3-541d8c51d6ce\") " pod="openstack/glance-db-create-pnfbr" Oct 06 12:25:08 crc kubenswrapper[4892]: I1006 12:25:08.895181 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4bbj\" (UniqueName: \"kubernetes.io/projected/30982372-73ba-48f1-b3b3-541d8c51d6ce-kube-api-access-h4bbj\") pod \"glance-db-create-pnfbr\" (UID: \"30982372-73ba-48f1-b3b3-541d8c51d6ce\") " pod="openstack/glance-db-create-pnfbr" Oct 06 12:25:09 crc kubenswrapper[4892]: I1006 12:25:09.031560 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pnfbr" Oct 06 12:25:09 crc kubenswrapper[4892]: I1006 12:25:09.270591 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:09 crc kubenswrapper[4892]: E1006 12:25:09.270797 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 12:25:09 crc kubenswrapper[4892]: E1006 12:25:09.270817 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 12:25:09 crc kubenswrapper[4892]: E1006 12:25:09.270898 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift podName:9f90d8be-05b7-4668-be7c-1494621a363b nodeName:}" failed. No retries permitted until 2025-10-06 12:25:17.270880392 +0000 UTC m=+1003.820586167 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift") pod "swift-storage-0" (UID: "9f90d8be-05b7-4668-be7c-1494621a363b") : configmap "swift-ring-files" not found Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.418943 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-mzrvr"] Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.421398 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mzrvr" Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.433979 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mzrvr"] Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.494812 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hn8r\" (UniqueName: \"kubernetes.io/projected/b9aee1c6-1d4d-4fd4-9aee-2760312e0e63-kube-api-access-9hn8r\") pod \"watcher-db-create-mzrvr\" (UID: \"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63\") " pod="openstack/watcher-db-create-mzrvr" Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.596994 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hn8r\" (UniqueName: \"kubernetes.io/projected/b9aee1c6-1d4d-4fd4-9aee-2760312e0e63-kube-api-access-9hn8r\") pod \"watcher-db-create-mzrvr\" (UID: \"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63\") " pod="openstack/watcher-db-create-mzrvr" Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.636976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hn8r\" (UniqueName: \"kubernetes.io/projected/b9aee1c6-1d4d-4fd4-9aee-2760312e0e63-kube-api-access-9hn8r\") pod \"watcher-db-create-mzrvr\" (UID: \"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63\") " pod="openstack/watcher-db-create-mzrvr" Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.697760 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.766925 4892 util.go:30] "No sandbox for pod can be found. 
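
The swift-storage-0 failure above is expected at this point: its projected volume etc-swift draws on configmap swift-ring-files, which does not exist yet (it is produced by the swift-ring-rebalance job still running), so MountVolume.SetUp fails and nestedpendingoperations schedules a retry with durationBeforeRetry 8s. When the 12:25:17 retry further down fails the same way, the delay doubles to 16s. A toy sketch of that doubling backoff; the initial delay and cap here are illustrative assumptions, not kubelet's actual values:

    package main

    import (
        "fmt"
        "time"
    )

    // nextDelay doubles the previous wait up to a ceiling, the pattern behind
    // the 8s and 16s durationBeforeRetry values in this log.
    func nextDelay(prev, ceil time.Duration) time.Duration {
        if prev <= 0 {
            return time.Second // illustrative initial delay
        }
        if d := 2 * prev; d < ceil {
            return d
        }
        return ceil
    }

    func main() {
        d := 4 * time.Second
        for i := 0; i < 4; i++ {
            d = nextDelay(d, 2*time.Minute)
            fmt.Println(d) // 8s, 16s, 32s, 1m4s
        }
    }

Once the rebalance job publishes the ring files, the mount can succeed; meanwhile the watcher-db-create-mzrvr pod continues below.
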
Need to start a new one" pod="openstack/watcher-db-create-mzrvr" Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.794129 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc64b8dc7-tlr89"] Oct 06 12:25:10 crc kubenswrapper[4892]: I1006 12:25:10.794453 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" podUID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerName="dnsmasq-dns" containerID="cri-o://366cb00b880fa452881ec9da475e87b0ff3f38cd147a18bd615f300578c98171" gracePeriod=10 Oct 06 12:25:11 crc kubenswrapper[4892]: I1006 12:25:11.361965 4892 generic.go:334] "Generic (PLEG): container finished" podID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerID="366cb00b880fa452881ec9da475e87b0ff3f38cd147a18bd615f300578c98171" exitCode=0 Oct 06 12:25:11 crc kubenswrapper[4892]: I1006 12:25:11.362050 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" event={"ID":"fa73d237-4fcb-45b6-b394-fb9295df0e2d","Type":"ContainerDied","Data":"366cb00b880fa452881ec9da475e87b0ff3f38cd147a18bd615f300578c98171"} Oct 06 12:25:11 crc kubenswrapper[4892]: I1006 12:25:11.364558 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerStarted","Data":"dd0c8389a544aca5952a2124e2ec0fbe7acc84d0f49e1af11649de19467053dc"} Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.042099 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.143027 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-config\") pod \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.143176 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjzkr\" (UniqueName: \"kubernetes.io/projected/fa73d237-4fcb-45b6-b394-fb9295df0e2d-kube-api-access-bjzkr\") pod \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.143217 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-dns-svc\") pod \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\" (UID: \"fa73d237-4fcb-45b6-b394-fb9295df0e2d\") " Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.147853 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa73d237-4fcb-45b6-b394-fb9295df0e2d-kube-api-access-bjzkr" (OuterVolumeSpecName: "kube-api-access-bjzkr") pod "fa73d237-4fcb-45b6-b394-fb9295df0e2d" (UID: "fa73d237-4fcb-45b6-b394-fb9295df0e2d"). InnerVolumeSpecName "kube-api-access-bjzkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.195378 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-config" (OuterVolumeSpecName: "config") pod "fa73d237-4fcb-45b6-b394-fb9295df0e2d" (UID: "fa73d237-4fcb-45b6-b394-fb9295df0e2d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.196199 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa73d237-4fcb-45b6-b394-fb9295df0e2d" (UID: "fa73d237-4fcb-45b6-b394-fb9295df0e2d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.245640 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjzkr\" (UniqueName: \"kubernetes.io/projected/fa73d237-4fcb-45b6-b394-fb9295df0e2d-kube-api-access-bjzkr\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.245785 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.245805 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa73d237-4fcb-45b6-b394-fb9295df0e2d-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.313073 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mzrvr"] Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.319680 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vgr8m"] Oct 06 12:25:13 crc kubenswrapper[4892]: W1006 12:25:13.322126 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9aee1c6_1d4d_4fd4_9aee_2760312e0e63.slice/crio-fdfede37c410c2b70f0c6bcb32531797345a5a9f0c9e9d062f0ec767a0b855f1 WatchSource:0}: Error finding container fdfede37c410c2b70f0c6bcb32531797345a5a9f0c9e9d062f0ec767a0b855f1: Status 404 returned error can't find the container with id fdfede37c410c2b70f0c6bcb32531797345a5a9f0c9e9d062f0ec767a0b855f1 Oct 06 12:25:13 crc kubenswrapper[4892]: W1006 12:25:13.333661 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a5823de_6b73_4608_b37e_031dc44dc68b.slice/crio-0f6104cfbe124f3409d5c1cf4c6d119593dd90a8948511263fb59eacc534e714 WatchSource:0}: Error finding container 0f6104cfbe124f3409d5c1cf4c6d119593dd90a8948511263fb59eacc534e714: Status 404 returned error can't find the container with id 0f6104cfbe124f3409d5c1cf4c6d119593dd90a8948511263fb59eacc534e714 Oct 06 12:25:13 crc kubenswrapper[4892]: W1006 12:25:13.335600 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30982372_73ba_48f1_b3b3_541d8c51d6ce.slice/crio-2ef227c5572040131311fee1f000dfe245067b8c46c3e7302546c46d88f3777c WatchSource:0}: Error finding container 2ef227c5572040131311fee1f000dfe245067b8c46c3e7302546c46d88f3777c: Status 404 returned error can't find the container with id 2ef227c5572040131311fee1f000dfe245067b8c46c3e7302546c46d88f3777c Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.338777 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pnfbr"] Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.380358 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.380401 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc64b8dc7-tlr89" event={"ID":"fa73d237-4fcb-45b6-b394-fb9295df0e2d","Type":"ContainerDied","Data":"5a72b6c67a6bc90d156ffffcd8b5622643fcbed71d393266521527760e8e7d1b"} Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.380467 4892 scope.go:117] "RemoveContainer" containerID="366cb00b880fa452881ec9da475e87b0ff3f38cd147a18bd615f300578c98171" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.389979 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mzrvr" event={"ID":"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63","Type":"ContainerStarted","Data":"fdfede37c410c2b70f0c6bcb32531797345a5a9f0c9e9d062f0ec767a0b855f1"} Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.391554 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vgr8m" event={"ID":"7a5823de-6b73-4608-b37e-031dc44dc68b","Type":"ContainerStarted","Data":"0f6104cfbe124f3409d5c1cf4c6d119593dd90a8948511263fb59eacc534e714"} Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.393235 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xjptv" event={"ID":"a045df0b-e5d4-4e68-b29f-47e270efa265","Type":"ContainerStarted","Data":"9182442ebe4658bf4d0a26673d2a5939a8f41c2cb621071b264750480c56e1a5"} Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.395370 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pnfbr" event={"ID":"30982372-73ba-48f1-b3b3-541d8c51d6ce","Type":"ContainerStarted","Data":"2ef227c5572040131311fee1f000dfe245067b8c46c3e7302546c46d88f3777c"} Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.412818 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-xjptv" podStartSLOduration=1.857867686 podStartE2EDuration="8.412798634s" podCreationTimestamp="2025-10-06 12:25:05 +0000 UTC" firstStartedPulling="2025-10-06 12:25:06.273867426 +0000 UTC m=+992.823573201" lastFinishedPulling="2025-10-06 12:25:12.828798374 +0000 UTC m=+999.378504149" observedRunningTime="2025-10-06 12:25:13.407448913 +0000 UTC m=+999.957154678" watchObservedRunningTime="2025-10-06 12:25:13.412798634 +0000 UTC m=+999.962504409" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.432393 4892 scope.go:117] "RemoveContainer" containerID="904dc31b2872529922d0f3c005434d08c258c066009006805ceb1359ad703f0b" Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.436132 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc64b8dc7-tlr89"] Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.442316 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc64b8dc7-tlr89"] Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.461658 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-t6bmr"] Oct 06 12:25:13 crc kubenswrapper[4892]: W1006 12:25:13.467873 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda660a565_71b7_4fd3_8864_f633a0dc1240.slice/crio-dc32fa53fa8db24c02f2d2be88a045d84e47114f10b03e688be0143e9b785222 WatchSource:0}: Error finding container dc32fa53fa8db24c02f2d2be88a045d84e47114f10b03e688be0143e9b785222: Status 404 returned error can't find the 
container with id dc32fa53fa8db24c02f2d2be88a045d84e47114f10b03e688be0143e9b785222 Oct 06 12:25:13 crc kubenswrapper[4892]: I1006 12:25:13.967686 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.186357 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" path="/var/lib/kubelet/pods/fa73d237-4fcb-45b6-b394-fb9295df0e2d/volumes" Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.411411 4892 generic.go:334] "Generic (PLEG): container finished" podID="7a5823de-6b73-4608-b37e-031dc44dc68b" containerID="fcc43a7ac582e3c6a72e7f49c95e35bc0a187b6b8538f5287799dc1c637b80eb" exitCode=0 Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.411585 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vgr8m" event={"ID":"7a5823de-6b73-4608-b37e-031dc44dc68b","Type":"ContainerDied","Data":"fcc43a7ac582e3c6a72e7f49c95e35bc0a187b6b8538f5287799dc1c637b80eb"} Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.415593 4892 generic.go:334] "Generic (PLEG): container finished" podID="30982372-73ba-48f1-b3b3-541d8c51d6ce" containerID="8fb2675980b207eb06aaf32bef2c8758e2bb924e25fbbe8c6309f61234d58b06" exitCode=0 Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.415665 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pnfbr" event={"ID":"30982372-73ba-48f1-b3b3-541d8c51d6ce","Type":"ContainerDied","Data":"8fb2675980b207eb06aaf32bef2c8758e2bb924e25fbbe8c6309f61234d58b06"} Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.420788 4892 generic.go:334] "Generic (PLEG): container finished" podID="a660a565-71b7-4fd3-8864-f633a0dc1240" containerID="85839fc4bc2b4586e150263513f7c8b8240dfc8efd2339a44c3cf4be37e39d0c" exitCode=0 Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.420875 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t6bmr" event={"ID":"a660a565-71b7-4fd3-8864-f633a0dc1240","Type":"ContainerDied","Data":"85839fc4bc2b4586e150263513f7c8b8240dfc8efd2339a44c3cf4be37e39d0c"} Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.420915 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t6bmr" event={"ID":"a660a565-71b7-4fd3-8864-f633a0dc1240","Type":"ContainerStarted","Data":"dc32fa53fa8db24c02f2d2be88a045d84e47114f10b03e688be0143e9b785222"} Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.430000 4892 generic.go:334] "Generic (PLEG): container finished" podID="b9aee1c6-1d4d-4fd4-9aee-2760312e0e63" containerID="de7a2f342856eed731d9c75acc4d717f3d0fb8dfc6be121cf100757515c9ba79" exitCode=0 Oct 06 12:25:14 crc kubenswrapper[4892]: I1006 12:25:14.430378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mzrvr" event={"ID":"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63","Type":"ContainerDied","Data":"de7a2f342856eed731d9c75acc4d717f3d0fb8dfc6be121cf100757515c9ba79"} Oct 06 12:25:15 crc kubenswrapper[4892]: I1006 12:25:15.455647 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerStarted","Data":"f692fc4783157738a25bd255555118a834fe52a60c84f1db96156f50891b5ea4"} Oct 06 12:25:15 crc kubenswrapper[4892]: I1006 12:25:15.529148 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=9.499766173 podStartE2EDuration="45.529127855s" podCreationTimestamp="2025-10-06 12:24:30 +0000 UTC" firstStartedPulling="2025-10-06 12:24:39.081266242 +0000 UTC m=+965.630972007" lastFinishedPulling="2025-10-06 12:25:15.110627934 +0000 UTC m=+1001.660333689" observedRunningTime="2025-10-06 12:25:15.506238106 +0000 UTC m=+1002.055943881" watchObservedRunningTime="2025-10-06 12:25:15.529127855 +0000 UTC m=+1002.078833640" Oct 06 12:25:15 crc kubenswrapper[4892]: I1006 12:25:15.887913 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-t6bmr" Oct 06 12:25:15 crc kubenswrapper[4892]: I1006 12:25:15.904699 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn6t6\" (UniqueName: \"kubernetes.io/projected/a660a565-71b7-4fd3-8864-f633a0dc1240-kube-api-access-fn6t6\") pod \"a660a565-71b7-4fd3-8864-f633a0dc1240\" (UID: \"a660a565-71b7-4fd3-8864-f633a0dc1240\") " Oct 06 12:25:15 crc kubenswrapper[4892]: I1006 12:25:15.917067 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a660a565-71b7-4fd3-8864-f633a0dc1240-kube-api-access-fn6t6" (OuterVolumeSpecName: "kube-api-access-fn6t6") pod "a660a565-71b7-4fd3-8864-f633a0dc1240" (UID: "a660a565-71b7-4fd3-8864-f633a0dc1240"). InnerVolumeSpecName "kube-api-access-fn6t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.006683 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn6t6\" (UniqueName: \"kubernetes.io/projected/a660a565-71b7-4fd3-8864-f633a0dc1240-kube-api-access-fn6t6\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.008153 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mzrvr" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.016766 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vgr8m" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.028836 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pnfbr" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.107614 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hn8r\" (UniqueName: \"kubernetes.io/projected/b9aee1c6-1d4d-4fd4-9aee-2760312e0e63-kube-api-access-9hn8r\") pod \"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63\" (UID: \"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63\") " Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.107769 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4bbj\" (UniqueName: \"kubernetes.io/projected/30982372-73ba-48f1-b3b3-541d8c51d6ce-kube-api-access-h4bbj\") pod \"30982372-73ba-48f1-b3b3-541d8c51d6ce\" (UID: \"30982372-73ba-48f1-b3b3-541d8c51d6ce\") " Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.107885 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvm8d\" (UniqueName: \"kubernetes.io/projected/7a5823de-6b73-4608-b37e-031dc44dc68b-kube-api-access-kvm8d\") pod \"7a5823de-6b73-4608-b37e-031dc44dc68b\" (UID: \"7a5823de-6b73-4608-b37e-031dc44dc68b\") " Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.110390 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9aee1c6-1d4d-4fd4-9aee-2760312e0e63-kube-api-access-9hn8r" (OuterVolumeSpecName: "kube-api-access-9hn8r") pod "b9aee1c6-1d4d-4fd4-9aee-2760312e0e63" (UID: "b9aee1c6-1d4d-4fd4-9aee-2760312e0e63"). InnerVolumeSpecName "kube-api-access-9hn8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.111406 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30982372-73ba-48f1-b3b3-541d8c51d6ce-kube-api-access-h4bbj" (OuterVolumeSpecName: "kube-api-access-h4bbj") pod "30982372-73ba-48f1-b3b3-541d8c51d6ce" (UID: "30982372-73ba-48f1-b3b3-541d8c51d6ce"). InnerVolumeSpecName "kube-api-access-h4bbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.112927 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5823de-6b73-4608-b37e-031dc44dc68b-kube-api-access-kvm8d" (OuterVolumeSpecName: "kube-api-access-kvm8d") pod "7a5823de-6b73-4608-b37e-031dc44dc68b" (UID: "7a5823de-6b73-4608-b37e-031dc44dc68b"). InnerVolumeSpecName "kube-api-access-kvm8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.210098 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvm8d\" (UniqueName: \"kubernetes.io/projected/7a5823de-6b73-4608-b37e-031dc44dc68b-kube-api-access-kvm8d\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.210411 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hn8r\" (UniqueName: \"kubernetes.io/projected/b9aee1c6-1d4d-4fd4-9aee-2760312e0e63-kube-api-access-9hn8r\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.210422 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4bbj\" (UniqueName: \"kubernetes.io/projected/30982372-73ba-48f1-b3b3-541d8c51d6ce-kube-api-access-h4bbj\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.467375 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vgr8m" event={"ID":"7a5823de-6b73-4608-b37e-031dc44dc68b","Type":"ContainerDied","Data":"0f6104cfbe124f3409d5c1cf4c6d119593dd90a8948511263fb59eacc534e714"} Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.467436 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f6104cfbe124f3409d5c1cf4c6d119593dd90a8948511263fb59eacc534e714" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.467519 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vgr8m" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.471367 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pnfbr" event={"ID":"30982372-73ba-48f1-b3b3-541d8c51d6ce","Type":"ContainerDied","Data":"2ef227c5572040131311fee1f000dfe245067b8c46c3e7302546c46d88f3777c"} Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.471425 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ef227c5572040131311fee1f000dfe245067b8c46c3e7302546c46d88f3777c" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.471443 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pnfbr" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.474795 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-t6bmr" event={"ID":"a660a565-71b7-4fd3-8864-f633a0dc1240","Type":"ContainerDied","Data":"dc32fa53fa8db24c02f2d2be88a045d84e47114f10b03e688be0143e9b785222"} Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.474859 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc32fa53fa8db24c02f2d2be88a045d84e47114f10b03e688be0143e9b785222" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.474860 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-t6bmr" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.477908 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mzrvr" event={"ID":"b9aee1c6-1d4d-4fd4-9aee-2760312e0e63","Type":"ContainerDied","Data":"fdfede37c410c2b70f0c6bcb32531797345a5a9f0c9e9d062f0ec767a0b855f1"} Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.477998 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdfede37c410c2b70f0c6bcb32531797345a5a9f0c9e9d062f0ec767a0b855f1" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.478286 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mzrvr" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.588805 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.588962 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:16 crc kubenswrapper[4892]: I1006 12:25:16.594970 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:17 crc kubenswrapper[4892]: I1006 12:25:17.333438 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:17 crc kubenswrapper[4892]: E1006 12:25:17.333706 4892 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 06 12:25:17 crc kubenswrapper[4892]: E1006 12:25:17.333747 4892 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 06 12:25:17 crc kubenswrapper[4892]: E1006 12:25:17.333840 4892 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift podName:9f90d8be-05b7-4668-be7c-1494621a363b nodeName:}" failed. No retries permitted until 2025-10-06 12:25:33.33381206 +0000 UTC m=+1019.883517865 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift") pod "swift-storage-0" (UID: "9f90d8be-05b7-4668-be7c-1494621a363b") : configmap "swift-ring-files" not found
Oct 06 12:25:17 crc kubenswrapper[4892]: I1006 12:25:17.492239 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Oct 06 12:25:20 crc kubenswrapper[4892]: I1006 12:25:20.520777 4892 generic.go:334] "Generic (PLEG): container finished" podID="a045df0b-e5d4-4e68-b29f-47e270efa265" containerID="9182442ebe4658bf4d0a26673d2a5939a8f41c2cb621071b264750480c56e1a5" exitCode=0
Oct 06 12:25:20 crc kubenswrapper[4892]: I1006 12:25:20.520900 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xjptv" event={"ID":"a045df0b-e5d4-4e68-b29f-47e270efa265","Type":"ContainerDied","Data":"9182442ebe4658bf4d0a26673d2a5939a8f41c2cb621071b264750480c56e1a5"}
Oct 06 12:25:20 crc kubenswrapper[4892]: I1006 12:25:20.536153 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Oct 06 12:25:20 crc kubenswrapper[4892]: I1006 12:25:20.536583 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="prometheus" containerID="cri-o://165b75baeec76a8834f499781ad98fd3c24d3fd287af404da8cbe5e88305d1db" gracePeriod=600
Oct 06 12:25:20 crc kubenswrapper[4892]: I1006 12:25:20.536772 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="thanos-sidecar" containerID="cri-o://f692fc4783157738a25bd255555118a834fe52a60c84f1db96156f50891b5ea4" gracePeriod=600
Oct 06 12:25:20 crc kubenswrapper[4892]: I1006 12:25:20.536839 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="config-reloader" containerID="cri-o://dd0c8389a544aca5952a2124e2ec0fbe7acc84d0f49e1af11649de19467053dc" gracePeriod=600
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.532557 4892 generic.go:334] "Generic (PLEG): container finished" podID="83392b37-0087-4c7c-ab0d-91af0c170445" containerID="f692fc4783157738a25bd255555118a834fe52a60c84f1db96156f50891b5ea4" exitCode=0
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.532840 4892 generic.go:334] "Generic (PLEG): container finished" podID="83392b37-0087-4c7c-ab0d-91af0c170445" containerID="dd0c8389a544aca5952a2124e2ec0fbe7acc84d0f49e1af11649de19467053dc" exitCode=0
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.532853 4892 generic.go:334] "Generic (PLEG): container finished" podID="83392b37-0087-4c7c-ab0d-91af0c170445" containerID="165b75baeec76a8834f499781ad98fd3c24d3fd287af404da8cbe5e88305d1db" exitCode=0
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.532745 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerDied","Data":"f692fc4783157738a25bd255555118a834fe52a60c84f1db96156f50891b5ea4"}
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.533054 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerDied","Data":"dd0c8389a544aca5952a2124e2ec0fbe7acc84d0f49e1af11649de19467053dc"}
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.533073 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerDied","Data":"165b75baeec76a8834f499781ad98fd3c24d3fd287af404da8cbe5e88305d1db"}
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.533086 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"83392b37-0087-4c7c-ab0d-91af0c170445","Type":"ContainerDied","Data":"41fc519563b0c3caf05d2714c5f2ac6d14fc35023fc50691ed951d17d696f052"}
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.533100 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41fc519563b0c3caf05d2714c5f2ac6d14fc35023fc50691ed951d17d696f052"
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.537975 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.604140 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-tls-assets\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.604234 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83392b37-0087-4c7c-ab0d-91af0c170445-prometheus-metric-storage-rulefiles-0\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.604281 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-web-config\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.604435 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r562t\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-kube-api-access-r562t\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.604518 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-thanos-prometheus-http-client-file\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.604540 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-config\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.604564 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83392b37-0087-4c7c-ab0d-91af0c170445-config-out\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.605560 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83392b37-0087-4c7c-ab0d-91af0c170445-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.605624 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"83392b37-0087-4c7c-ab0d-91af0c170445\" (UID: \"83392b37-0087-4c7c-ab0d-91af0c170445\") "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.606055 4892 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/83392b37-0087-4c7c-ab0d-91af0c170445-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.611311 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.611799 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-config" (OuterVolumeSpecName: "config") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.612728 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-kube-api-access-r562t" (OuterVolumeSpecName: "kube-api-access-r562t") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "kube-api-access-r562t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.615022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83392b37-0087-4c7c-ab0d-91af0c170445-config-out" (OuterVolumeSpecName: "config-out") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.615647 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.639725 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.645432 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-web-config" (OuterVolumeSpecName: "web-config") pod "83392b37-0087-4c7c-ab0d-91af0c170445" (UID: "83392b37-0087-4c7c-ab0d-91af0c170445"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.708940 4892 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.708997 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-config\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.709021 4892 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/83392b37-0087-4c7c-ab0d-91af0c170445-config-out\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.709084 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") on node \"crc\" "
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.709117 4892 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-tls-assets\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.709146 4892 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/83392b37-0087-4c7c-ab0d-91af0c170445-web-config\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.709210 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r562t\" (UniqueName: \"kubernetes.io/projected/83392b37-0087-4c7c-ab0d-91af0c170445-kube-api-access-r562t\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.738731 4892 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.738885 4892 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa") on node "crc" Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.810367 4892 reconciler_common.go:293] "Volume detached for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:21 crc kubenswrapper[4892]: I1006 12:25:21.920738 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.013519 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h8rc\" (UniqueName: \"kubernetes.io/projected/a045df0b-e5d4-4e68-b29f-47e270efa265-kube-api-access-5h8rc\") pod \"a045df0b-e5d4-4e68-b29f-47e270efa265\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.013690 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-combined-ca-bundle\") pod \"a045df0b-e5d4-4e68-b29f-47e270efa265\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.013739 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-scripts\") pod \"a045df0b-e5d4-4e68-b29f-47e270efa265\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.013835 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-dispersionconf\") pod \"a045df0b-e5d4-4e68-b29f-47e270efa265\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.014383 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-swiftconf\") pod \"a045df0b-e5d4-4e68-b29f-47e270efa265\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.014464 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-ring-data-devices\") pod \"a045df0b-e5d4-4e68-b29f-47e270efa265\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.014504 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a045df0b-e5d4-4e68-b29f-47e270efa265-etc-swift\") pod \"a045df0b-e5d4-4e68-b29f-47e270efa265\" (UID: \"a045df0b-e5d4-4e68-b29f-47e270efa265\") " Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.015509 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod 
"a045df0b-e5d4-4e68-b29f-47e270efa265" (UID: "a045df0b-e5d4-4e68-b29f-47e270efa265"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.015528 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a045df0b-e5d4-4e68-b29f-47e270efa265-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a045df0b-e5d4-4e68-b29f-47e270efa265" (UID: "a045df0b-e5d4-4e68-b29f-47e270efa265"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.019538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a045df0b-e5d4-4e68-b29f-47e270efa265-kube-api-access-5h8rc" (OuterVolumeSpecName: "kube-api-access-5h8rc") pod "a045df0b-e5d4-4e68-b29f-47e270efa265" (UID: "a045df0b-e5d4-4e68-b29f-47e270efa265"). InnerVolumeSpecName "kube-api-access-5h8rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.023176 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a045df0b-e5d4-4e68-b29f-47e270efa265" (UID: "a045df0b-e5d4-4e68-b29f-47e270efa265"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.034537 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a045df0b-e5d4-4e68-b29f-47e270efa265" (UID: "a045df0b-e5d4-4e68-b29f-47e270efa265"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.041649 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a045df0b-e5d4-4e68-b29f-47e270efa265" (UID: "a045df0b-e5d4-4e68-b29f-47e270efa265"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.043611 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-scripts" (OuterVolumeSpecName: "scripts") pod "a045df0b-e5d4-4e68-b29f-47e270efa265" (UID: "a045df0b-e5d4-4e68-b29f-47e270efa265"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.116993 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.117026 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.117036 4892 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.117048 4892 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a045df0b-e5d4-4e68-b29f-47e270efa265-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.117055 4892 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a045df0b-e5d4-4e68-b29f-47e270efa265-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.117065 4892 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a045df0b-e5d4-4e68-b29f-47e270efa265-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.117073 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h8rc\" (UniqueName: \"kubernetes.io/projected/a045df0b-e5d4-4e68-b29f-47e270efa265-kube-api-access-5h8rc\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.545111 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.545496 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-xjptv" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.545504 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-xjptv" event={"ID":"a045df0b-e5d4-4e68-b29f-47e270efa265","Type":"ContainerDied","Data":"11bbb872b71dfaea235175a44ec85f963232278b6a34f0537a25dc236c672320"} Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.545550 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11bbb872b71dfaea235175a44ec85f963232278b6a34f0537a25dc236c672320" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.575534 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.595858 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.623977 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624316 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerName="dnsmasq-dns" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624366 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerName="dnsmasq-dns" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624374 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a660a565-71b7-4fd3-8864-f633a0dc1240" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624383 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a660a565-71b7-4fd3-8864-f633a0dc1240" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624424 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30982372-73ba-48f1-b3b3-541d8c51d6ce" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624433 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="30982372-73ba-48f1-b3b3-541d8c51d6ce" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624445 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5823de-6b73-4608-b37e-031dc44dc68b" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624453 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5823de-6b73-4608-b37e-031dc44dc68b" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624465 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="init-config-reloader" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624471 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="init-config-reloader" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624480 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="thanos-sidecar" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624486 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="thanos-sidecar" Oct 06 12:25:22 crc kubenswrapper[4892]: 
E1006 12:25:22.624500 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="config-reloader" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624507 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="config-reloader" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624517 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9aee1c6-1d4d-4fd4-9aee-2760312e0e63" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624524 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9aee1c6-1d4d-4fd4-9aee-2760312e0e63" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624539 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerName="init" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624546 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerName="init" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624556 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="prometheus" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624563 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="prometheus" Oct 06 12:25:22 crc kubenswrapper[4892]: E1006 12:25:22.624575 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a045df0b-e5d4-4e68-b29f-47e270efa265" containerName="swift-ring-rebalance" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624582 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a045df0b-e5d4-4e68-b29f-47e270efa265" containerName="swift-ring-rebalance" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624750 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9aee1c6-1d4d-4fd4-9aee-2760312e0e63" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624775 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa73d237-4fcb-45b6-b394-fb9295df0e2d" containerName="dnsmasq-dns" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624786 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="config-reloader" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624796 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5823de-6b73-4608-b37e-031dc44dc68b" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624808 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a660a565-71b7-4fd3-8864-f633a0dc1240" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624817 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a045df0b-e5d4-4e68-b29f-47e270efa265" containerName="swift-ring-rebalance" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624826 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" containerName="thanos-sidecar" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624836 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" 
containerName="prometheus" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.624847 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="30982372-73ba-48f1-b3b3-541d8c51d6ce" containerName="mariadb-database-create" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.626275 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.628858 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.641134 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.641409 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7smc" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.641524 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.641674 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.645896 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.649086 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.651520 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.726786 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9bjb\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-kube-api-access-q9bjb\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727168 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-config\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727215 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10380cce-a552-488a-8157-ea8425662776-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc 
kubenswrapper[4892]: I1006 12:25:22.727262 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727283 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727370 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10380cce-a552-488a-8157-ea8425662776-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727460 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727488 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727511 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.727606 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829123 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829186 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829218 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829246 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829280 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9bjb\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-kube-api-access-q9bjb\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829302 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-config\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829355 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829374 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10380cce-a552-488a-8157-ea8425662776-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829407 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-secret-combined-ca-bundle\") 
pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.829462 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10380cce-a552-488a-8157-ea8425662776-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.830278 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10380cce-a552-488a-8157-ea8425662776-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.832595 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.832623 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b0f201420d0adafcb475a965fbfd99b4a272413cc10e31ea76ae8257a696a4f5/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.834528 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.834865 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.835723 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.835956 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.838363 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10380cce-a552-488a-8157-ea8425662776-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.838437 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-config\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.841533 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.843430 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.848430 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9bjb\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-kube-api-access-q9bjb\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.877185 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.947693 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.985117 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:25:22 crc kubenswrapper[4892]: I1006 12:25:22.985427 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:25:23 crc kubenswrapper[4892]: I1006 12:25:23.419820 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:25:23 crc kubenswrapper[4892]: I1006 12:25:23.556572 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerStarted","Data":"1d082586dc4ded12b3a47603e3687cdca460014ec3a897ccea43b6bebab2141a"} Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.194944 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83392b37-0087-4c7c-ab0d-91af0c170445" path="/var/lib/kubelet/pods/83392b37-0087-4c7c-ab0d-91af0c170445/volumes" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.389854 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l6mw2" podUID="cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8" containerName="ovn-controller" probeResult="failure" output=< Oct 06 12:25:24 crc kubenswrapper[4892]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 06 12:25:24 crc kubenswrapper[4892]: > Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.397668 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.439882 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qtgzv" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.569478 4892 generic.go:334] "Generic (PLEG): container finished" podID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerID="f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076" exitCode=0 Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.569586 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c","Type":"ContainerDied","Data":"f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076"} Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.572094 4892 generic.go:334] "Generic (PLEG): container finished" podID="000efd26-a8c0-4668-9603-9ee7a9aed0ed" containerID="4cda7faae9d03cb110f0480c21e359deb1758a2d96ca327e2884fff0bb5b4f5b" exitCode=0 Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.572138 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"000efd26-a8c0-4668-9603-9ee7a9aed0ed","Type":"ContainerDied","Data":"4cda7faae9d03cb110f0480c21e359deb1758a2d96ca327e2884fff0bb5b4f5b"} Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.576658 4892 generic.go:334] 
"Generic (PLEG): container finished" podID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerID="7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2" exitCode=0 Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.576762 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bcc90cdb-7f84-4923-9eef-4fae34199b75","Type":"ContainerDied","Data":"7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2"} Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.713179 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l6mw2-config-8lfkm"] Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.714817 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.719762 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.736221 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l6mw2-config-8lfkm"] Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.768377 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-log-ovn\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.768546 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2ff4\" (UniqueName: \"kubernetes.io/projected/da651abe-1c63-4172-92d6-dc0b424bbacb-kube-api-access-w2ff4\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.768658 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-scripts\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.768814 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run-ovn\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.768919 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.768959 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-additional-scripts\") pod 
\"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.870648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2ff4\" (UniqueName: \"kubernetes.io/projected/da651abe-1c63-4172-92d6-dc0b424bbacb-kube-api-access-w2ff4\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.871488 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-scripts\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.873645 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run-ovn\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.873772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.873833 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-additional-scripts\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.873897 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-log-ovn\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.874153 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-scripts\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.874438 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run-ovn\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.874441 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run\") pod \"ovn-controller-l6mw2-config-8lfkm\" 
(UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.874439 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-log-ovn\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.874865 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-additional-scripts\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:24 crc kubenswrapper[4892]: I1006 12:25:24.894140 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2ff4\" (UniqueName: \"kubernetes.io/projected/da651abe-1c63-4172-92d6-dc0b424bbacb-kube-api-access-w2ff4\") pod \"ovn-controller-l6mw2-config-8lfkm\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.081876 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.594767 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bcc90cdb-7f84-4923-9eef-4fae34199b75","Type":"ContainerStarted","Data":"48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4"} Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.595043 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.598049 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c","Type":"ContainerStarted","Data":"675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263"} Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.598244 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.601688 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"000efd26-a8c0-4668-9603-9ee7a9aed0ed","Type":"ContainerStarted","Data":"488e1ab9be81ae88d9be3c0b22cd5157aad7fe09b1097dfd3af97ea4b25acae7"} Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.601942 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.644999 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l6mw2-config-8lfkm"] Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.645423 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.09890958 podStartE2EDuration="1m2.645406978s" podCreationTimestamp="2025-10-06 12:24:23 +0000 UTC" firstStartedPulling="2025-10-06 12:24:37.855907928 +0000 UTC m=+964.405613683" lastFinishedPulling="2025-10-06 12:24:47.402405316 +0000 UTC m=+973.952111081" 
observedRunningTime="2025-10-06 12:25:25.636740987 +0000 UTC m=+1012.186446782" watchObservedRunningTime="2025-10-06 12:25:25.645406978 +0000 UTC m=+1012.195112743" Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.696576 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=53.082594616 podStartE2EDuration="1m2.696557767s" podCreationTimestamp="2025-10-06 12:24:23 +0000 UTC" firstStartedPulling="2025-10-06 12:24:38.732018484 +0000 UTC m=+965.281724249" lastFinishedPulling="2025-10-06 12:24:48.345981625 +0000 UTC m=+974.895687400" observedRunningTime="2025-10-06 12:25:25.691939948 +0000 UTC m=+1012.241645723" watchObservedRunningTime="2025-10-06 12:25:25.696557767 +0000 UTC m=+1012.246263542" Oct 06 12:25:25 crc kubenswrapper[4892]: I1006 12:25:25.719380 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.34990404 podStartE2EDuration="1m2.719365193s" podCreationTimestamp="2025-10-06 12:24:23 +0000 UTC" firstStartedPulling="2025-10-06 12:24:38.735788619 +0000 UTC m=+965.285494414" lastFinishedPulling="2025-10-06 12:24:48.105249792 +0000 UTC m=+974.654955567" observedRunningTime="2025-10-06 12:25:25.712186667 +0000 UTC m=+1012.261892452" watchObservedRunningTime="2025-10-06 12:25:25.719365193 +0000 UTC m=+1012.269070958" Oct 06 12:25:26 crc kubenswrapper[4892]: I1006 12:25:26.615104 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerStarted","Data":"de88c8c1c78db93f82cbf007c5d171de907a99ee8d91ee70708b4b1725246986"} Oct 06 12:25:26 crc kubenswrapper[4892]: I1006 12:25:26.618192 4892 generic.go:334] "Generic (PLEG): container finished" podID="da651abe-1c63-4172-92d6-dc0b424bbacb" containerID="ba8d2ebf65b90cf60310ced4e4fa82489e0b79f9e8a427798a4590c8aa265643" exitCode=0 Oct 06 12:25:26 crc kubenswrapper[4892]: I1006 12:25:26.618255 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l6mw2-config-8lfkm" event={"ID":"da651abe-1c63-4172-92d6-dc0b424bbacb","Type":"ContainerDied","Data":"ba8d2ebf65b90cf60310ced4e4fa82489e0b79f9e8a427798a4590c8aa265643"} Oct 06 12:25:26 crc kubenswrapper[4892]: I1006 12:25:26.618301 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l6mw2-config-8lfkm" event={"ID":"da651abe-1c63-4172-92d6-dc0b424bbacb","Type":"ContainerStarted","Data":"2ebeca7f9d429aa597e0f0ec8f27690e5998ebf66822d02d4f67107f2ac577ef"} Oct 06 12:25:27 crc kubenswrapper[4892]: I1006 12:25:27.992451 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.135911 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2ff4\" (UniqueName: \"kubernetes.io/projected/da651abe-1c63-4172-92d6-dc0b424bbacb-kube-api-access-w2ff4\") pod \"da651abe-1c63-4172-92d6-dc0b424bbacb\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136002 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-log-ovn\") pod \"da651abe-1c63-4172-92d6-dc0b424bbacb\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136029 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run-ovn\") pod \"da651abe-1c63-4172-92d6-dc0b424bbacb\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136066 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-scripts\") pod \"da651abe-1c63-4172-92d6-dc0b424bbacb\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136088 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "da651abe-1c63-4172-92d6-dc0b424bbacb" (UID: "da651abe-1c63-4172-92d6-dc0b424bbacb"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136103 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-additional-scripts\") pod \"da651abe-1c63-4172-92d6-dc0b424bbacb\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136118 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "da651abe-1c63-4172-92d6-dc0b424bbacb" (UID: "da651abe-1c63-4172-92d6-dc0b424bbacb"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136244 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run\") pod \"da651abe-1c63-4172-92d6-dc0b424bbacb\" (UID: \"da651abe-1c63-4172-92d6-dc0b424bbacb\") " Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136385 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run" (OuterVolumeSpecName: "var-run") pod "da651abe-1c63-4172-92d6-dc0b424bbacb" (UID: "da651abe-1c63-4172-92d6-dc0b424bbacb"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136733 4892 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136756 4892 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136766 4892 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/da651abe-1c63-4172-92d6-dc0b424bbacb-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136771 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "da651abe-1c63-4172-92d6-dc0b424bbacb" (UID: "da651abe-1c63-4172-92d6-dc0b424bbacb"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.136925 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-scripts" (OuterVolumeSpecName: "scripts") pod "da651abe-1c63-4172-92d6-dc0b424bbacb" (UID: "da651abe-1c63-4172-92d6-dc0b424bbacb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.143592 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da651abe-1c63-4172-92d6-dc0b424bbacb-kube-api-access-w2ff4" (OuterVolumeSpecName: "kube-api-access-w2ff4") pod "da651abe-1c63-4172-92d6-dc0b424bbacb" (UID: "da651abe-1c63-4172-92d6-dc0b424bbacb"). InnerVolumeSpecName "kube-api-access-w2ff4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.238236 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.238283 4892 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/da651abe-1c63-4172-92d6-dc0b424bbacb-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.238301 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2ff4\" (UniqueName: \"kubernetes.io/projected/da651abe-1c63-4172-92d6-dc0b424bbacb-kube-api-access-w2ff4\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.244216 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-523d-account-create-4mr87"] Oct 06 12:25:28 crc kubenswrapper[4892]: E1006 12:25:28.244656 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da651abe-1c63-4172-92d6-dc0b424bbacb" containerName="ovn-config" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.244675 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="da651abe-1c63-4172-92d6-dc0b424bbacb" containerName="ovn-config" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.244861 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="da651abe-1c63-4172-92d6-dc0b424bbacb" containerName="ovn-config" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.245483 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-523d-account-create-4mr87" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.255623 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.258027 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-523d-account-create-4mr87"] Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.341284 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xvc\" (UniqueName: \"kubernetes.io/projected/5f7f1fb6-763b-45fa-87dd-027c5397ed92-kube-api-access-s6xvc\") pod \"keystone-523d-account-create-4mr87\" (UID: \"5f7f1fb6-763b-45fa-87dd-027c5397ed92\") " pod="openstack/keystone-523d-account-create-4mr87" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.444280 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6xvc\" (UniqueName: \"kubernetes.io/projected/5f7f1fb6-763b-45fa-87dd-027c5397ed92-kube-api-access-s6xvc\") pod \"keystone-523d-account-create-4mr87\" (UID: \"5f7f1fb6-763b-45fa-87dd-027c5397ed92\") " pod="openstack/keystone-523d-account-create-4mr87" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.472405 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xvc\" (UniqueName: \"kubernetes.io/projected/5f7f1fb6-763b-45fa-87dd-027c5397ed92-kube-api-access-s6xvc\") pod \"keystone-523d-account-create-4mr87\" (UID: \"5f7f1fb6-763b-45fa-87dd-027c5397ed92\") " pod="openstack/keystone-523d-account-create-4mr87" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.548747 4892 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-90fa-account-create-vgbzz"] Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.550150 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-90fa-account-create-vgbzz" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.552890 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.559620 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-90fa-account-create-vgbzz"] Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.560588 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-523d-account-create-4mr87" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.637857 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l6mw2-config-8lfkm" event={"ID":"da651abe-1c63-4172-92d6-dc0b424bbacb","Type":"ContainerDied","Data":"2ebeca7f9d429aa597e0f0ec8f27690e5998ebf66822d02d4f67107f2ac577ef"} Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.637911 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ebeca7f9d429aa597e0f0ec8f27690e5998ebf66822d02d4f67107f2ac577ef" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.637921 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l6mw2-config-8lfkm" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.649615 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhlts\" (UniqueName: \"kubernetes.io/projected/3d0c75fb-89d3-494d-a468-01293842310b-kube-api-access-jhlts\") pod \"placement-90fa-account-create-vgbzz\" (UID: \"3d0c75fb-89d3-494d-a468-01293842310b\") " pod="openstack/placement-90fa-account-create-vgbzz" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.756982 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhlts\" (UniqueName: \"kubernetes.io/projected/3d0c75fb-89d3-494d-a468-01293842310b-kube-api-access-jhlts\") pod \"placement-90fa-account-create-vgbzz\" (UID: \"3d0c75fb-89d3-494d-a468-01293842310b\") " pod="openstack/placement-90fa-account-create-vgbzz" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.772907 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-04c9-account-create-mqr4p"] Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.774468 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-04c9-account-create-mqr4p" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.776556 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.784093 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhlts\" (UniqueName: \"kubernetes.io/projected/3d0c75fb-89d3-494d-a468-01293842310b-kube-api-access-jhlts\") pod \"placement-90fa-account-create-vgbzz\" (UID: \"3d0c75fb-89d3-494d-a468-01293842310b\") " pod="openstack/placement-90fa-account-create-vgbzz" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.787175 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-04c9-account-create-mqr4p"] Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.871636 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-90fa-account-create-vgbzz" Oct 06 12:25:28 crc kubenswrapper[4892]: I1006 12:25:28.960457 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ss5f\" (UniqueName: \"kubernetes.io/projected/9a658d1b-748b-4345-881e-54b1369b86d0-kube-api-access-8ss5f\") pod \"glance-04c9-account-create-mqr4p\" (UID: \"9a658d1b-748b-4345-881e-54b1369b86d0\") " pod="openstack/glance-04c9-account-create-mqr4p" Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.047911 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-523d-account-create-4mr87"] Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.063629 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ss5f\" (UniqueName: \"kubernetes.io/projected/9a658d1b-748b-4345-881e-54b1369b86d0-kube-api-access-8ss5f\") pod \"glance-04c9-account-create-mqr4p\" (UID: \"9a658d1b-748b-4345-881e-54b1369b86d0\") " pod="openstack/glance-04c9-account-create-mqr4p" Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.092857 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ss5f\" (UniqueName: \"kubernetes.io/projected/9a658d1b-748b-4345-881e-54b1369b86d0-kube-api-access-8ss5f\") pod \"glance-04c9-account-create-mqr4p\" (UID: \"9a658d1b-748b-4345-881e-54b1369b86d0\") " pod="openstack/glance-04c9-account-create-mqr4p" Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.103748 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-04c9-account-create-mqr4p"
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.128387 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-l6mw2-config-8lfkm"]
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.133376 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-l6mw2-config-8lfkm"]
Oct 06 12:25:29 crc kubenswrapper[4892]: W1006 12:25:29.313916 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d0c75fb_89d3_494d_a468_01293842310b.slice/crio-918a39ff6cf68e826165eb0d0d9041b70d6289405ed516fccdca2394f5b2353e WatchSource:0}: Error finding container 918a39ff6cf68e826165eb0d0d9041b70d6289405ed516fccdca2394f5b2353e: Status 404 returned error can't find the container with id 918a39ff6cf68e826165eb0d0d9041b70d6289405ed516fccdca2394f5b2353e
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.316143 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-90fa-account-create-vgbzz"]
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.365811 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-l6mw2"
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.534515 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-04c9-account-create-mqr4p"]
Oct 06 12:25:29 crc kubenswrapper[4892]: W1006 12:25:29.537637 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a658d1b_748b_4345_881e_54b1369b86d0.slice/crio-42a33f0f6529bb99abb144accac9c01c708c4e191461b561bd772bafe1104f61 WatchSource:0}: Error finding container 42a33f0f6529bb99abb144accac9c01c708c4e191461b561bd772bafe1104f61: Status 404 returned error can't find the container with id 42a33f0f6529bb99abb144accac9c01c708c4e191461b561bd772bafe1104f61
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.648772 4892 generic.go:334] "Generic (PLEG): container finished" podID="5f7f1fb6-763b-45fa-87dd-027c5397ed92" containerID="032bfdb681258912b6d32e8e1384c7156e9becfc6f82fd17ea605531be34a1ef" exitCode=0
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.648830 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-523d-account-create-4mr87" event={"ID":"5f7f1fb6-763b-45fa-87dd-027c5397ed92","Type":"ContainerDied","Data":"032bfdb681258912b6d32e8e1384c7156e9becfc6f82fd17ea605531be34a1ef"}
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.648900 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-523d-account-create-4mr87" event={"ID":"5f7f1fb6-763b-45fa-87dd-027c5397ed92","Type":"ContainerStarted","Data":"6a1d92a4411fcb8492ab37e175ad551fdba321fe2e318d1642f3af0fc0ae51b6"}
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.650379 4892 generic.go:334] "Generic (PLEG): container finished" podID="3d0c75fb-89d3-494d-a468-01293842310b" containerID="1d109eef19210b4534f30b697185078e0fef41e0be75d1cb89d96b91fcad67c8" exitCode=0
Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.650488 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-90fa-account-create-vgbzz" event={"ID":"3d0c75fb-89d3-494d-a468-01293842310b","Type":"ContainerDied","Data":"1d109eef19210b4534f30b697185078e0fef41e0be75d1cb89d96b91fcad67c8"}
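The event={...} payloads in the "SyncLoop (PLEG)" lines above are printed as valid JSON, so a container timeline for any pod can be pulled straight out of a journal dump. A throwaway sketch (the field names mirror the lines above; the regexp is ad hoc and assumes one event per line, e.g. fed from journalctl -u kubelet on stdin):

    package main

    import (
        "bufio"
        "encoding/json"
        "fmt"
        "os"
        "regexp"
    )

    // plegEvent mirrors the event={...} payload in the SyncLoop (PLEG) lines.
    type plegEvent struct {
        ID   string // pod UID
        Type string // ContainerStarted / ContainerDied
        Data string // container or sandbox ID
    }

    func main() {
        re := regexp.MustCompile(`pod="([^"]+)" event=(\{.*?\})`)
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines can be long
        for sc.Scan() {
            m := re.FindStringSubmatch(sc.Text())
            if m == nil {
                continue
            }
            var ev plegEvent
            if err := json.Unmarshal([]byte(m[2]), &ev); err == nil {
                fmt.Printf("%s\t%s\t%s\n", m[1], ev.Type, ev.Data)
            }
        }
    }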
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-90fa-account-create-vgbzz" event={"ID":"3d0c75fb-89d3-494d-a468-01293842310b","Type":"ContainerStarted","Data":"918a39ff6cf68e826165eb0d0d9041b70d6289405ed516fccdca2394f5b2353e"} Oct 06 12:25:29 crc kubenswrapper[4892]: I1006 12:25:29.651516 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-04c9-account-create-mqr4p" event={"ID":"9a658d1b-748b-4345-881e-54b1369b86d0","Type":"ContainerStarted","Data":"42a33f0f6529bb99abb144accac9c01c708c4e191461b561bd772bafe1104f61"} Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.185960 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da651abe-1c63-4172-92d6-dc0b424bbacb" path="/var/lib/kubelet/pods/da651abe-1c63-4172-92d6-dc0b424bbacb/volumes" Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.532558 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-1d51-account-create-cmxch"] Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.534996 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-1d51-account-create-cmxch" Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.538195 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.540490 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-1d51-account-create-cmxch"] Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.663235 4892 generic.go:334] "Generic (PLEG): container finished" podID="9a658d1b-748b-4345-881e-54b1369b86d0" containerID="d775c5f4f1f8c92b811f7c6014bf303282cda0fcea766855a5889d929c0cddd1" exitCode=0 Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.663311 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-04c9-account-create-mqr4p" event={"ID":"9a658d1b-748b-4345-881e-54b1369b86d0","Type":"ContainerDied","Data":"d775c5f4f1f8c92b811f7c6014bf303282cda0fcea766855a5889d929c0cddd1"} Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.709717 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqstw\" (UniqueName: \"kubernetes.io/projected/2d216276-2620-49c2-8be9-05784aca5d45-kube-api-access-qqstw\") pod \"watcher-1d51-account-create-cmxch\" (UID: \"2d216276-2620-49c2-8be9-05784aca5d45\") " pod="openstack/watcher-1d51-account-create-cmxch" Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.810549 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqstw\" (UniqueName: \"kubernetes.io/projected/2d216276-2620-49c2-8be9-05784aca5d45-kube-api-access-qqstw\") pod \"watcher-1d51-account-create-cmxch\" (UID: \"2d216276-2620-49c2-8be9-05784aca5d45\") " pod="openstack/watcher-1d51-account-create-cmxch" Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.834411 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqstw\" (UniqueName: \"kubernetes.io/projected/2d216276-2620-49c2-8be9-05784aca5d45-kube-api-access-qqstw\") pod \"watcher-1d51-account-create-cmxch\" (UID: \"2d216276-2620-49c2-8be9-05784aca5d45\") " pod="openstack/watcher-1d51-account-create-cmxch" Oct 06 12:25:30 crc kubenswrapper[4892]: I1006 12:25:30.866045 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-1d51-account-create-cmxch" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.110850 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-90fa-account-create-vgbzz" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.113596 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhlts\" (UniqueName: \"kubernetes.io/projected/3d0c75fb-89d3-494d-a468-01293842310b-kube-api-access-jhlts\") pod \"3d0c75fb-89d3-494d-a468-01293842310b\" (UID: \"3d0c75fb-89d3-494d-a468-01293842310b\") " Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.128660 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d0c75fb-89d3-494d-a468-01293842310b-kube-api-access-jhlts" (OuterVolumeSpecName: "kube-api-access-jhlts") pod "3d0c75fb-89d3-494d-a468-01293842310b" (UID: "3d0c75fb-89d3-494d-a468-01293842310b"). InnerVolumeSpecName "kube-api-access-jhlts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.205486 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-523d-account-create-4mr87" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.214203 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6xvc\" (UniqueName: \"kubernetes.io/projected/5f7f1fb6-763b-45fa-87dd-027c5397ed92-kube-api-access-s6xvc\") pod \"5f7f1fb6-763b-45fa-87dd-027c5397ed92\" (UID: \"5f7f1fb6-763b-45fa-87dd-027c5397ed92\") " Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.214490 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhlts\" (UniqueName: \"kubernetes.io/projected/3d0c75fb-89d3-494d-a468-01293842310b-kube-api-access-jhlts\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.219622 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7f1fb6-763b-45fa-87dd-027c5397ed92-kube-api-access-s6xvc" (OuterVolumeSpecName: "kube-api-access-s6xvc") pod "5f7f1fb6-763b-45fa-87dd-027c5397ed92" (UID: "5f7f1fb6-763b-45fa-87dd-027c5397ed92"). InnerVolumeSpecName "kube-api-access-s6xvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.316592 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6xvc\" (UniqueName: \"kubernetes.io/projected/5f7f1fb6-763b-45fa-87dd-027c5397ed92-kube-api-access-s6xvc\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.367003 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-1d51-account-create-cmxch"] Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.672700 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-523d-account-create-4mr87" event={"ID":"5f7f1fb6-763b-45fa-87dd-027c5397ed92","Type":"ContainerDied","Data":"6a1d92a4411fcb8492ab37e175ad551fdba321fe2e318d1642f3af0fc0ae51b6"} Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.673015 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1d92a4411fcb8492ab37e175ad551fdba321fe2e318d1642f3af0fc0ae51b6" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.672770 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-523d-account-create-4mr87" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.675605 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-90fa-account-create-vgbzz" event={"ID":"3d0c75fb-89d3-494d-a468-01293842310b","Type":"ContainerDied","Data":"918a39ff6cf68e826165eb0d0d9041b70d6289405ed516fccdca2394f5b2353e"} Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.675656 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918a39ff6cf68e826165eb0d0d9041b70d6289405ed516fccdca2394f5b2353e" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.675725 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-90fa-account-create-vgbzz" Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.677923 4892 generic.go:334] "Generic (PLEG): container finished" podID="2d216276-2620-49c2-8be9-05784aca5d45" containerID="c7e1f580ebc331249c116ea91a1fd22f44f3b23aeb06e42884fc192888f4315f" exitCode=0 Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.677981 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1d51-account-create-cmxch" event={"ID":"2d216276-2620-49c2-8be9-05784aca5d45","Type":"ContainerDied","Data":"c7e1f580ebc331249c116ea91a1fd22f44f3b23aeb06e42884fc192888f4315f"} Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.678046 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1d51-account-create-cmxch" event={"ID":"2d216276-2620-49c2-8be9-05784aca5d45","Type":"ContainerStarted","Data":"2ee9df330f1780bd023cc6fe6c074fb3372c5d3165e7fbdce3d968fc5bbd99f8"} Oct 06 12:25:31 crc kubenswrapper[4892]: I1006 12:25:31.953114 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-04c9-account-create-mqr4p" Oct 06 12:25:32 crc kubenswrapper[4892]: I1006 12:25:32.026941 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ss5f\" (UniqueName: \"kubernetes.io/projected/9a658d1b-748b-4345-881e-54b1369b86d0-kube-api-access-8ss5f\") pod \"9a658d1b-748b-4345-881e-54b1369b86d0\" (UID: \"9a658d1b-748b-4345-881e-54b1369b86d0\") " Oct 06 12:25:32 crc kubenswrapper[4892]: I1006 12:25:32.033979 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a658d1b-748b-4345-881e-54b1369b86d0-kube-api-access-8ss5f" (OuterVolumeSpecName: "kube-api-access-8ss5f") pod "9a658d1b-748b-4345-881e-54b1369b86d0" (UID: "9a658d1b-748b-4345-881e-54b1369b86d0"). InnerVolumeSpecName "kube-api-access-8ss5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:32 crc kubenswrapper[4892]: I1006 12:25:32.129184 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ss5f\" (UniqueName: \"kubernetes.io/projected/9a658d1b-748b-4345-881e-54b1369b86d0-kube-api-access-8ss5f\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:32 crc kubenswrapper[4892]: I1006 12:25:32.692822 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-04c9-account-create-mqr4p" Oct 06 12:25:32 crc kubenswrapper[4892]: I1006 12:25:32.692814 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-04c9-account-create-mqr4p" event={"ID":"9a658d1b-748b-4345-881e-54b1369b86d0","Type":"ContainerDied","Data":"42a33f0f6529bb99abb144accac9c01c708c4e191461b561bd772bafe1104f61"} Oct 06 12:25:32 crc kubenswrapper[4892]: I1006 12:25:32.693295 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42a33f0f6529bb99abb144accac9c01c708c4e191461b561bd772bafe1104f61" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.096577 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-1d51-account-create-cmxch" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.252336 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqstw\" (UniqueName: \"kubernetes.io/projected/2d216276-2620-49c2-8be9-05784aca5d45-kube-api-access-qqstw\") pod \"2d216276-2620-49c2-8be9-05784aca5d45\" (UID: \"2d216276-2620-49c2-8be9-05784aca5d45\") " Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.256883 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d216276-2620-49c2-8be9-05784aca5d45-kube-api-access-qqstw" (OuterVolumeSpecName: "kube-api-access-qqstw") pod "2d216276-2620-49c2-8be9-05784aca5d45" (UID: "2d216276-2620-49c2-8be9-05784aca5d45"). InnerVolumeSpecName "kube-api-access-qqstw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.353742 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.353862 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqstw\" (UniqueName: \"kubernetes.io/projected/2d216276-2620-49c2-8be9-05784aca5d45-kube-api-access-qqstw\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.361378 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9f90d8be-05b7-4668-be7c-1494621a363b-etc-swift\") pod \"swift-storage-0\" (UID: \"9f90d8be-05b7-4668-be7c-1494621a363b\") " pod="openstack/swift-storage-0" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.616988 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.710444 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-1d51-account-create-cmxch" event={"ID":"2d216276-2620-49c2-8be9-05784aca5d45","Type":"ContainerDied","Data":"2ee9df330f1780bd023cc6fe6c074fb3372c5d3165e7fbdce3d968fc5bbd99f8"} Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.710488 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ee9df330f1780bd023cc6fe6c074fb3372c5d3165e7fbdce3d968fc5bbd99f8" Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.710513 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-1d51-account-create-cmxch"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.810840 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-nvxxf"]
Oct 06 12:25:33 crc kubenswrapper[4892]: E1006 12:25:33.811165 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7f1fb6-763b-45fa-87dd-027c5397ed92" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811176 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7f1fb6-763b-45fa-87dd-027c5397ed92" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: E1006 12:25:33.811200 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d216276-2620-49c2-8be9-05784aca5d45" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811207 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d216276-2620-49c2-8be9-05784aca5d45" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: E1006 12:25:33.811220 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a658d1b-748b-4345-881e-54b1369b86d0" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811242 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a658d1b-748b-4345-881e-54b1369b86d0" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: E1006 12:25:33.811253 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0c75fb-89d3-494d-a468-01293842310b" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811258 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0c75fb-89d3-494d-a468-01293842310b" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811479 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a658d1b-748b-4345-881e-54b1369b86d0" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811489 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f7f1fb6-763b-45fa-87dd-027c5397ed92" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811508 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0c75fb-89d3-494d-a468-01293842310b" containerName="mariadb-account-create"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.811523 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d216276-2620-49c2-8be9-05784aca5d45" containerName="mariadb-account-create"
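The RemoveStaleState burst above is the CPU and memory managers reacting to the four finished mariadb-account-create pods: once a pod UID is no longer in the active set, any per-container resource assignment still recorded for it is dropped. The cleanup pattern, schematically (an assumed structure for illustration; kubelet's real managers keep richer state than an empty struct):

    // Package staleness sketches the RemoveStaleState pattern seen above;
    // it is not kubelet's actual cpu_manager or memory_manager code.
    package staleness

    type containerKey struct {
        podUID, container string
    }

    type manager struct {
        assignments map[containerKey]struct{} // e.g. pinned CPUs or memory blocks
    }

    // RemoveStaleState drops assignments for any pod no longer active;
    // deleting from a map while ranging over it is well-defined in Go.
    func (m *manager) RemoveStaleState(activePods map[string]bool) {
        for k := range m.assignments {
            if !activePods[k.podUID] {
                delete(m.assignments, k) // logged as "Deleted CPUSet assignment"
            }
        }
    }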
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.812078 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.814282 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f6nmd"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.817142 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.831158 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nvxxf"]
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.972116 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-combined-ca-bundle\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.972187 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v666\" (UniqueName: \"kubernetes.io/projected/21c27657-8048-4b06-9079-4e89e71f369e-kube-api-access-2v666\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.972215 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-db-sync-config-data\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:33 crc kubenswrapper[4892]: I1006 12:25:33.972282 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-config-data\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.074311 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-combined-ca-bundle\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.074708 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v666\" (UniqueName: \"kubernetes.io/projected/21c27657-8048-4b06-9079-4e89e71f369e-kube-api-access-2v666\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.074751 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-db-sync-config-data\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.074850 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-config-data\") pod
\"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf" Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.079865 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-combined-ca-bundle\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf" Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.086734 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-db-sync-config-data\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf" Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.093405 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-config-data\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf" Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.095738 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v666\" (UniqueName: \"kubernetes.io/projected/21c27657-8048-4b06-9079-4e89e71f369e-kube-api-access-2v666\") pod \"glance-db-sync-nvxxf\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " pod="openstack/glance-db-sync-nvxxf" Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.128407 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f6nmd" Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.138599 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nvxxf"
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.264463 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.486247 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="000efd26-a8c0-4668-9603-9ee7a9aed0ed" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.109:5671: connect: connection refused"
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.724938 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nvxxf"]
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.728906 4892 generic.go:334] "Generic (PLEG): container finished" podID="10380cce-a552-488a-8157-ea8425662776" containerID="de88c8c1c78db93f82cbf007c5d171de907a99ee8d91ee70708b4b1725246986" exitCode=0
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.728956 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerDied","Data":"de88c8c1c78db93f82cbf007c5d171de907a99ee8d91ee70708b4b1725246986"}
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.733155 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"cfc4628ef6b05b05558419fceee7b9c66a1581f48752c2e371999734ad4f960c"}
Oct 06 12:25:34 crc kubenswrapper[4892]: I1006 12:25:34.790528 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.110:5671: connect: connection refused"
Oct 06 12:25:34 crc kubenswrapper[4892]: W1006 12:25:34.962835 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21c27657_8048_4b06_9079_4e89e71f369e.slice/crio-12612ec85e6eed7103d8f1a265ff5d3b35cd481a7d4f593b57fda74808f3ff4e WatchSource:0}: Error finding container 12612ec85e6eed7103d8f1a265ff5d3b35cd481a7d4f593b57fda74808f3ff4e: Status 404 returned error can't find the container with id 12612ec85e6eed7103d8f1a265ff5d3b35cd481a7d4f593b57fda74808f3ff4e
Oct 06 12:25:35 crc kubenswrapper[4892]: I1006 12:25:35.014002 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.111:5671: connect: connection refused"
Oct 06 12:25:35 crc kubenswrapper[4892]: I1006 12:25:35.743594 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"dd4fd64a16939d2068603e7041887e1d6a546ad110f2d917a44e842c81124dd4"}
Oct 06 12:25:35 crc kubenswrapper[4892]: I1006 12:25:35.743890 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"a64e49ce8d2b7625869c8cc41f55ade1b4c058eb2176a602024e21cb45cacf33"}
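The "Probe failed" lines above are plain tcpSocket readiness probes against the RabbitMQ TLS listener on port 5671; "connection refused" just means nothing is listening yet, and the same probes flip to status="ready" at 12:25:44-45 further down. A tcpSocket probe reduces to roughly the following (a sketch only; kubelet's prober additionally resolves host fields, applies the configured timeout, and records results):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // probeTCP is approximately what a tcpSocket readiness probe does:
    // dial with a timeout, close immediately, and report any error.
    func probeTCP(addr string, timeout time.Duration) error {
        conn, err := net.DialTimeout("tcp", addr, timeout)
        if err != nil {
            return err // e.g. "dial tcp 10.217.0.109:5671: connect: connection refused"
        }
        return conn.Close()
    }

    func main() {
        // Pod IP and port taken from the failure above; the address is only
        // reachable from inside the cluster, so expect an error elsewhere.
        fmt.Println(probeTCP("10.217.0.109:5671", time.Second))
    }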
event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"c8604d8b94610b3d26b6259270308cb2dcbb1ed7e22e3aecfdbf683a30002461"} Oct 06 12:25:35 crc kubenswrapper[4892]: I1006 12:25:35.746050 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nvxxf" event={"ID":"21c27657-8048-4b06-9079-4e89e71f369e","Type":"ContainerStarted","Data":"12612ec85e6eed7103d8f1a265ff5d3b35cd481a7d4f593b57fda74808f3ff4e"} Oct 06 12:25:35 crc kubenswrapper[4892]: I1006 12:25:35.750166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerStarted","Data":"0cee2e277f10a258841ce91b152eb5485f29229b28557888fb3b6f98b8b3e42e"} Oct 06 12:25:36 crc kubenswrapper[4892]: I1006 12:25:36.762315 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"7ddbab68899cbf0236f65ccb233aea2909fd0c7fe015bd8e6f507dc4558992c3"} Oct 06 12:25:36 crc kubenswrapper[4892]: I1006 12:25:36.762656 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"9b741e052156ebf32d4f949d0fefaffe0858430a657d631f8749d2de441a8a0a"} Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.773467 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerStarted","Data":"6174bee3a7c656575e63cd034f8f5f86c38d4548b4770b38f3d52c08f17f05be"} Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.773720 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerStarted","Data":"9bf96ce1640bcc275470f6e5513f22f48c5d3c0e0f3ca083a65b56c0dc15231d"} Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.780786 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"1c2ce3e7c0300247eb3efb128d9a35b7f80de2e394e78c9f39647c8f2a2e4bb8"} Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.780821 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"9ef0851598948afa437ce1e3295c73b5bca58d5b120f2fa8390941e3047d2190"} Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.780830 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"fea81b6ad00ac776e7df179e883a69d1fd49bab9f7e0d4c2340a4f9a97b44d74"} Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.799615 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.799599553 podStartE2EDuration="15.799599553s" podCreationTimestamp="2025-10-06 12:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:25:37.792887731 +0000 UTC m=+1024.342593496" watchObservedRunningTime="2025-10-06 12:25:37.799599553 +0000 UTC m=+1024.349305318" Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.948197 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.948348 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:37 crc kubenswrapper[4892]: I1006 12:25:37.954305 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:38 crc kubenswrapper[4892]: I1006 12:25:38.795646 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"d25b9d4b8804062a3a7147a7db34ee94ef51a9abf78656ea8574d9ad303a36be"} Oct 06 12:25:38 crc kubenswrapper[4892]: I1006 12:25:38.796253 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"022c0f2f7bc2faf10cf4c8137b1fb3501160dbb382244a8ac34e99cf3f9ba3d9"} Oct 06 12:25:38 crc kubenswrapper[4892]: I1006 12:25:38.796272 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"548758a96adce2db315473a1d4731b6f94ac94507a428e679d0440b80d19f264"} Oct 06 12:25:38 crc kubenswrapper[4892]: I1006 12:25:38.800088 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 06 12:25:39 crc kubenswrapper[4892]: I1006 12:25:39.822462 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"fefde8996cc3b83d51c89bc6ef22ea7653ed222053eefc014da95d96a0b8194c"} Oct 06 12:25:39 crc kubenswrapper[4892]: I1006 12:25:39.822794 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"0821ec6fc3cdcf0f9d4fed92b3fca50af6466568f4541aaab86fa1992539e034"} Oct 06 12:25:39 crc kubenswrapper[4892]: I1006 12:25:39.822811 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"7a364022e287dd07cd67ae08657eb20380ac9a1fcb48e1432b8516c26f775494"} Oct 06 12:25:39 crc kubenswrapper[4892]: I1006 12:25:39.822820 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"9f90d8be-05b7-4668-be7c-1494621a363b","Type":"ContainerStarted","Data":"88e451ae70c15ef70385f748ca41a38ecad0282208e02cf298206c8892911439"} Oct 06 12:25:39 crc kubenswrapper[4892]: I1006 12:25:39.868720 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.994956729 podStartE2EDuration="39.868698673s" podCreationTimestamp="2025-10-06 12:25:00 +0000 UTC" firstStartedPulling="2025-10-06 12:25:34.263300842 +0000 UTC m=+1020.813006607" lastFinishedPulling="2025-10-06 12:25:38.137042766 +0000 UTC m=+1024.686748551" observedRunningTime="2025-10-06 12:25:39.859393003 +0000 UTC m=+1026.409098778" watchObservedRunningTime="2025-10-06 12:25:39.868698673 +0000 UTC m=+1026.418404448" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.258173 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55ddd64775-5m4mj"] Oct 06 
12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.259547 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.261891 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.273978 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55ddd64775-5m4mj"] Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.392374 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-svc\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.392458 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9ssg\" (UniqueName: \"kubernetes.io/projected/725ebf18-e01b-4408-af76-e1c187a5abce-kube-api-access-z9ssg\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.392481 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-config\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.392507 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-nb\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.392545 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-sb\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.392607 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-swift-storage-0\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.493707 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-svc\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.493776 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9ssg\" (UniqueName: 
\"kubernetes.io/projected/725ebf18-e01b-4408-af76-e1c187a5abce-kube-api-access-z9ssg\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.493792 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-config\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.493812 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-nb\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.493842 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-sb\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.493899 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-swift-storage-0\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.494695 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-swift-storage-0\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.495191 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-svc\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.496012 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-config\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.496500 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-nb\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.496957 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-sb\") pod 
\"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.514472 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9ssg\" (UniqueName: \"kubernetes.io/projected/725ebf18-e01b-4408-af76-e1c187a5abce-kube-api-access-z9ssg\") pod \"dnsmasq-dns-55ddd64775-5m4mj\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:40 crc kubenswrapper[4892]: I1006 12:25:40.576702 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:41 crc kubenswrapper[4892]: I1006 12:25:41.035554 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55ddd64775-5m4mj"] Oct 06 12:25:44 crc kubenswrapper[4892]: I1006 12:25:44.487722 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 06 12:25:44 crc kubenswrapper[4892]: I1006 12:25:44.789513 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 12:25:45 crc kubenswrapper[4892]: I1006 12:25:45.013884 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:25:45 crc kubenswrapper[4892]: E1006 12:25:45.653172 4892 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.144:49336->38.102.83.144:40237: write tcp 38.102.83.144:49336->38.102.83.144:40237: write: broken pipe Oct 06 12:25:45 crc kubenswrapper[4892]: I1006 12:25:45.897965 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zqblg"] Oct 06 12:25:45 crc kubenswrapper[4892]: I1006 12:25:45.898950 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zqblg" Oct 06 12:25:45 crc kubenswrapper[4892]: I1006 12:25:45.935773 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zqblg"] Oct 06 12:25:45 crc kubenswrapper[4892]: I1006 12:25:45.970878 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-zjjqh"] Oct 06 12:25:45 crc kubenswrapper[4892]: I1006 12:25:45.972182 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:45 crc kubenswrapper[4892]: I1006 12:25:45.990828 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqzzd\" (UniqueName: \"kubernetes.io/projected/2d54d8a4-1c52-4e19-a689-e68c7f751bbd-kube-api-access-cqzzd\") pod \"barbican-db-create-zqblg\" (UID: \"2d54d8a4-1c52-4e19-a689-e68c7f751bbd\") " pod="openstack/barbican-db-create-zqblg" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.005978 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-dswng" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.006986 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.013281 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-zjjqh"] Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.092915 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/425bb472-054f-4ce7-8788-f63e794dff02-kube-api-access-5kkf6\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.092977 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-combined-ca-bundle\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.093022 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqzzd\" (UniqueName: \"kubernetes.io/projected/2d54d8a4-1c52-4e19-a689-e68c7f751bbd-kube-api-access-cqzzd\") pod \"barbican-db-create-zqblg\" (UID: \"2d54d8a4-1c52-4e19-a689-e68c7f751bbd\") " pod="openstack/barbican-db-create-zqblg" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.093061 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-config-data\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.093170 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-db-sync-config-data\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.096428 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cccks"] Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.097530 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cccks" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.109179 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cccks"] Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.128075 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqzzd\" (UniqueName: \"kubernetes.io/projected/2d54d8a4-1c52-4e19-a689-e68c7f751bbd-kube-api-access-cqzzd\") pod \"barbican-db-create-zqblg\" (UID: \"2d54d8a4-1c52-4e19-a689-e68c7f751bbd\") " pod="openstack/barbican-db-create-zqblg" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.195482 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-db-sync-config-data\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.195765 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n45c5\" (UniqueName: \"kubernetes.io/projected/2c01bdac-6171-4739-8a4d-93e871a3cbe4-kube-api-access-n45c5\") pod \"cinder-db-create-cccks\" (UID: \"2c01bdac-6171-4739-8a4d-93e871a3cbe4\") " pod="openstack/cinder-db-create-cccks" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.195802 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/425bb472-054f-4ce7-8788-f63e794dff02-kube-api-access-5kkf6\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.195826 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-combined-ca-bundle\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.195852 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-config-data\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.200998 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-combined-ca-bundle\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.201277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-config-data\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.201465 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-db-sync-config-data\") pod \"watcher-db-sync-zjjqh\" 
(UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.211152 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/425bb472-054f-4ce7-8788-f63e794dff02-kube-api-access-5kkf6\") pod \"watcher-db-sync-zjjqh\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") " pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.217338 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zqblg" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.293445 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-ph7rn"] Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.297495 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n45c5\" (UniqueName: \"kubernetes.io/projected/2c01bdac-6171-4739-8a4d-93e871a3cbe4-kube-api-access-n45c5\") pod \"cinder-db-create-cccks\" (UID: \"2c01bdac-6171-4739-8a4d-93e871a3cbe4\") " pod="openstack/cinder-db-create-cccks" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.297978 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-zjjqh" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.300510 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ph7rn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.309233 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ph7rn"] Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.323739 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n45c5\" (UniqueName: \"kubernetes.io/projected/2c01bdac-6171-4739-8a4d-93e871a3cbe4-kube-api-access-n45c5\") pod \"cinder-db-create-cccks\" (UID: \"2c01bdac-6171-4739-8a4d-93e871a3cbe4\") " pod="openstack/cinder-db-create-cccks" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.360557 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-pk4nn"] Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.362649 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.364812 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.364928 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.364831 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.364878 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qj9t" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.368646 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-pk4nn"] Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.400046 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvp82\" (UniqueName: \"kubernetes.io/projected/e2bdc381-3838-44ce-bcfc-9cc714973c19-kube-api-access-mvp82\") pod \"neutron-db-create-ph7rn\" (UID: \"e2bdc381-3838-44ce-bcfc-9cc714973c19\") " pod="openstack/neutron-db-create-ph7rn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.400126 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqh9b\" (UniqueName: \"kubernetes.io/projected/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-kube-api-access-dqh9b\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.400164 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-config-data\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.400279 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-combined-ca-bundle\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.419964 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cccks" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.501867 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-combined-ca-bundle\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.501982 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvp82\" (UniqueName: \"kubernetes.io/projected/e2bdc381-3838-44ce-bcfc-9cc714973c19-kube-api-access-mvp82\") pod \"neutron-db-create-ph7rn\" (UID: \"e2bdc381-3838-44ce-bcfc-9cc714973c19\") " pod="openstack/neutron-db-create-ph7rn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.502020 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqh9b\" (UniqueName: \"kubernetes.io/projected/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-kube-api-access-dqh9b\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.502044 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-config-data\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.506735 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-combined-ca-bundle\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.514738 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-config-data\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.521400 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqh9b\" (UniqueName: \"kubernetes.io/projected/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-kube-api-access-dqh9b\") pod \"keystone-db-sync-pk4nn\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") " pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.524023 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvp82\" (UniqueName: \"kubernetes.io/projected/e2bdc381-3838-44ce-bcfc-9cc714973c19-kube-api-access-mvp82\") pod \"neutron-db-create-ph7rn\" (UID: \"e2bdc381-3838-44ce-bcfc-9cc714973c19\") " pod="openstack/neutron-db-create-ph7rn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.618308 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ph7rn" Oct 06 12:25:46 crc kubenswrapper[4892]: I1006 12:25:46.689111 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:25:49 crc kubenswrapper[4892]: W1006 12:25:49.175959 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod725ebf18_e01b_4408_af76_e1c187a5abce.slice/crio-74169174aca2704629fe164c0ee4e479aadca3d064b69d0c2f631a943231bede WatchSource:0}: Error finding container 74169174aca2704629fe164c0ee4e479aadca3d064b69d0c2f631a943231bede: Status 404 returned error can't find the container with id 74169174aca2704629fe164c0ee4e479aadca3d064b69d0c2f631a943231bede Oct 06 12:25:49 crc kubenswrapper[4892]: I1006 12:25:49.911610 4892 generic.go:334] "Generic (PLEG): container finished" podID="725ebf18-e01b-4408-af76-e1c187a5abce" containerID="a1da148e2dcfeae87a224f2b83a3a6b51a059dd6a42d2f478d5553d9c4eff4a8" exitCode=0 Oct 06 12:25:49 crc kubenswrapper[4892]: I1006 12:25:49.911917 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" event={"ID":"725ebf18-e01b-4408-af76-e1c187a5abce","Type":"ContainerDied","Data":"a1da148e2dcfeae87a224f2b83a3a6b51a059dd6a42d2f478d5553d9c4eff4a8"} Oct 06 12:25:49 crc kubenswrapper[4892]: I1006 12:25:49.912453 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" event={"ID":"725ebf18-e01b-4408-af76-e1c187a5abce","Type":"ContainerStarted","Data":"74169174aca2704629fe164c0ee4e479aadca3d064b69d0c2f631a943231bede"} Oct 06 12:25:49 crc kubenswrapper[4892]: I1006 12:25:49.940054 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zqblg"] Oct 06 12:25:49 crc kubenswrapper[4892]: W1006 12:25:49.956087 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d54d8a4_1c52_4e19_a689_e68c7f751bbd.slice/crio-6dc1a2c28d5a7ae98a7ee868c7480b3bdd8601c95d8fbdd66d32563dc968934a WatchSource:0}: Error finding container 6dc1a2c28d5a7ae98a7ee868c7480b3bdd8601c95d8fbdd66d32563dc968934a: Status 404 returned error can't find the container with id 6dc1a2c28d5a7ae98a7ee868c7480b3bdd8601c95d8fbdd66d32563dc968934a Oct 06 12:25:49 crc kubenswrapper[4892]: I1006 12:25:49.963371 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cccks"] Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.116786 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-ph7rn"] Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.130498 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-pk4nn"] Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.132037 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-zjjqh"] Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.922799 4892 generic.go:334] "Generic (PLEG): container finished" podID="2c01bdac-6171-4739-8a4d-93e871a3cbe4" containerID="57a581dbd135bfbd8eca9715c3f4b99506811f2ae5aa9eb68b4773e70f691ea4" exitCode=0 Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.922870 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cccks" event={"ID":"2c01bdac-6171-4739-8a4d-93e871a3cbe4","Type":"ContainerDied","Data":"57a581dbd135bfbd8eca9715c3f4b99506811f2ae5aa9eb68b4773e70f691ea4"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.923269 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cccks" 
event={"ID":"2c01bdac-6171-4739-8a4d-93e871a3cbe4","Type":"ContainerStarted","Data":"ff1c626b37fbb4a05eba91f69fd47d1b259c9afeb8571e0fefa2aacfa5d45c9f"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.925591 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nvxxf" event={"ID":"21c27657-8048-4b06-9079-4e89e71f369e","Type":"ContainerStarted","Data":"50177a9bcdc642ce507031c7b9da5ffd8bd1086536674d1c74931054aac365c4"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.927668 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" event={"ID":"725ebf18-e01b-4408-af76-e1c187a5abce","Type":"ContainerStarted","Data":"fe18e9b9c621ba56e6aa5c811ad3e1795a7fced9b3def48c4109afffd2aa27d0"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.927760 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.929219 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pk4nn" event={"ID":"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3","Type":"ContainerStarted","Data":"f849ae713534b9814fe6add0d9f9174b0137fb20ac564b17f087fdec7d6d9aef"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.931420 4892 generic.go:334] "Generic (PLEG): container finished" podID="e2bdc381-3838-44ce-bcfc-9cc714973c19" containerID="2a469f1b7a3690e6f78faaeae562c456e62c27df6fbaefb2814211691e64f81a" exitCode=0 Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.931485 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ph7rn" event={"ID":"e2bdc381-3838-44ce-bcfc-9cc714973c19","Type":"ContainerDied","Data":"2a469f1b7a3690e6f78faaeae562c456e62c27df6fbaefb2814211691e64f81a"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.931504 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ph7rn" event={"ID":"e2bdc381-3838-44ce-bcfc-9cc714973c19","Type":"ContainerStarted","Data":"a6ab195c4c10b7da44181d76142f6df459f3e940b64f1f375f7da23e6bac54c7"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.933010 4892 generic.go:334] "Generic (PLEG): container finished" podID="2d54d8a4-1c52-4e19-a689-e68c7f751bbd" containerID="c8e643b16ef17d42c6ea36a2e772d9f84d7dd21f81526165a2090abb5adf5acf" exitCode=0 Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.933066 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zqblg" event={"ID":"2d54d8a4-1c52-4e19-a689-e68c7f751bbd","Type":"ContainerDied","Data":"c8e643b16ef17d42c6ea36a2e772d9f84d7dd21f81526165a2090abb5adf5acf"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.933083 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zqblg" event={"ID":"2d54d8a4-1c52-4e19-a689-e68c7f751bbd","Type":"ContainerStarted","Data":"6dc1a2c28d5a7ae98a7ee868c7480b3bdd8601c95d8fbdd66d32563dc968934a"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.934152 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-zjjqh" event={"ID":"425bb472-054f-4ce7-8788-f63e794dff02","Type":"ContainerStarted","Data":"19b6cf66b592aa8ec8b971236c67dd80f7da55b5a15dcfe80d5f4175d9c99fd2"} Oct 06 12:25:50 crc kubenswrapper[4892]: I1006 12:25:50.982962 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-nvxxf" podStartSLOduration=3.548478512 
podStartE2EDuration="17.98294605s" podCreationTimestamp="2025-10-06 12:25:33 +0000 UTC" firstStartedPulling="2025-10-06 12:25:34.964932321 +0000 UTC m=+1021.514638076" lastFinishedPulling="2025-10-06 12:25:49.399399839 +0000 UTC m=+1035.949105614" observedRunningTime="2025-10-06 12:25:50.980242259 +0000 UTC m=+1037.529948044" watchObservedRunningTime="2025-10-06 12:25:50.98294605 +0000 UTC m=+1037.532651815" Oct 06 12:25:51 crc kubenswrapper[4892]: I1006 12:25:51.004781 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" podStartSLOduration=11.004760717 podStartE2EDuration="11.004760717s" podCreationTimestamp="2025-10-06 12:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:25:51.001918381 +0000 UTC m=+1037.551624146" watchObservedRunningTime="2025-10-06 12:25:51.004760717 +0000 UTC m=+1037.554466492" Oct 06 12:25:52 crc kubenswrapper[4892]: I1006 12:25:52.984598 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:25:52 crc kubenswrapper[4892]: I1006 12:25:52.984927 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.930000 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cccks" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.935949 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zqblg" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.943040 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-ph7rn" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.977683 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-ph7rn" event={"ID":"e2bdc381-3838-44ce-bcfc-9cc714973c19","Type":"ContainerDied","Data":"a6ab195c4c10b7da44181d76142f6df459f3e940b64f1f375f7da23e6bac54c7"} Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.977730 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6ab195c4c10b7da44181d76142f6df459f3e940b64f1f375f7da23e6bac54c7" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.977799 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-ph7rn" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.980182 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zqblg" event={"ID":"2d54d8a4-1c52-4e19-a689-e68c7f751bbd","Type":"ContainerDied","Data":"6dc1a2c28d5a7ae98a7ee868c7480b3bdd8601c95d8fbdd66d32563dc968934a"} Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.980210 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc1a2c28d5a7ae98a7ee868c7480b3bdd8601c95d8fbdd66d32563dc968934a" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.980257 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zqblg" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.985910 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cccks" event={"ID":"2c01bdac-6171-4739-8a4d-93e871a3cbe4","Type":"ContainerDied","Data":"ff1c626b37fbb4a05eba91f69fd47d1b259c9afeb8571e0fefa2aacfa5d45c9f"} Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.985959 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff1c626b37fbb4a05eba91f69fd47d1b259c9afeb8571e0fefa2aacfa5d45c9f" Oct 06 12:25:53 crc kubenswrapper[4892]: I1006 12:25:53.986008 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cccks" Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.048573 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqzzd\" (UniqueName: \"kubernetes.io/projected/2d54d8a4-1c52-4e19-a689-e68c7f751bbd-kube-api-access-cqzzd\") pod \"2d54d8a4-1c52-4e19-a689-e68c7f751bbd\" (UID: \"2d54d8a4-1c52-4e19-a689-e68c7f751bbd\") " Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.048647 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvp82\" (UniqueName: \"kubernetes.io/projected/e2bdc381-3838-44ce-bcfc-9cc714973c19-kube-api-access-mvp82\") pod \"e2bdc381-3838-44ce-bcfc-9cc714973c19\" (UID: \"e2bdc381-3838-44ce-bcfc-9cc714973c19\") " Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.048786 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n45c5\" (UniqueName: \"kubernetes.io/projected/2c01bdac-6171-4739-8a4d-93e871a3cbe4-kube-api-access-n45c5\") pod \"2c01bdac-6171-4739-8a4d-93e871a3cbe4\" (UID: \"2c01bdac-6171-4739-8a4d-93e871a3cbe4\") " Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.056715 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c01bdac-6171-4739-8a4d-93e871a3cbe4-kube-api-access-n45c5" (OuterVolumeSpecName: "kube-api-access-n45c5") pod "2c01bdac-6171-4739-8a4d-93e871a3cbe4" (UID: "2c01bdac-6171-4739-8a4d-93e871a3cbe4"). InnerVolumeSpecName "kube-api-access-n45c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.056756 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d54d8a4-1c52-4e19-a689-e68c7f751bbd-kube-api-access-cqzzd" (OuterVolumeSpecName: "kube-api-access-cqzzd") pod "2d54d8a4-1c52-4e19-a689-e68c7f751bbd" (UID: "2d54d8a4-1c52-4e19-a689-e68c7f751bbd"). InnerVolumeSpecName "kube-api-access-cqzzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.057053 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bdc381-3838-44ce-bcfc-9cc714973c19-kube-api-access-mvp82" (OuterVolumeSpecName: "kube-api-access-mvp82") pod "e2bdc381-3838-44ce-bcfc-9cc714973c19" (UID: "e2bdc381-3838-44ce-bcfc-9cc714973c19"). InnerVolumeSpecName "kube-api-access-mvp82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.151130 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqzzd\" (UniqueName: \"kubernetes.io/projected/2d54d8a4-1c52-4e19-a689-e68c7f751bbd-kube-api-access-cqzzd\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.151175 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvp82\" (UniqueName: \"kubernetes.io/projected/e2bdc381-3838-44ce-bcfc-9cc714973c19-kube-api-access-mvp82\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:54 crc kubenswrapper[4892]: I1006 12:25:54.151189 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n45c5\" (UniqueName: \"kubernetes.io/projected/2c01bdac-6171-4739-8a4d-93e871a3cbe4-kube-api-access-n45c5\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:55 crc kubenswrapper[4892]: I1006 12:25:55.579637 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:25:55 crc kubenswrapper[4892]: I1006 12:25:55.688616 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-576bc8c5c-zqcrg"] Oct 06 12:25:55 crc kubenswrapper[4892]: I1006 12:25:55.688878 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerName="dnsmasq-dns" containerID="cri-o://2a9ee1a37261f8bc47efa6c831feb86237893b5061cba640c7269575eb601583" gracePeriod=10 Oct 06 12:25:55 crc kubenswrapper[4892]: I1006 12:25:55.697222 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection reset by peer" Oct 06 12:25:56 crc kubenswrapper[4892]: I1006 12:25:56.027030 4892 generic.go:334] "Generic (PLEG): container finished" podID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerID="2a9ee1a37261f8bc47efa6c831feb86237893b5061cba640c7269575eb601583" exitCode=0 Oct 06 12:25:56 crc kubenswrapper[4892]: I1006 12:25:56.027088 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" event={"ID":"5c260537-1016-427a-a2b8-4a9046dbd3de","Type":"ContainerDied","Data":"2a9ee1a37261f8bc47efa6c831feb86237893b5061cba640c7269575eb601583"} Oct 06 12:25:57 crc kubenswrapper[4892]: I1006 12:25:57.041012 4892 generic.go:334] "Generic (PLEG): container finished" podID="21c27657-8048-4b06-9079-4e89e71f369e" containerID="50177a9bcdc642ce507031c7b9da5ffd8bd1086536674d1c74931054aac365c4" exitCode=0 Oct 06 12:25:57 crc kubenswrapper[4892]: I1006 12:25:57.041056 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nvxxf" event={"ID":"21c27657-8048-4b06-9079-4e89e71f369e","Type":"ContainerDied","Data":"50177a9bcdc642ce507031c7b9da5ffd8bd1086536674d1c74931054aac365c4"} Oct 06 12:25:58 crc 
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.243137 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg"
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.339118 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-dns-svc\") pod \"5c260537-1016-427a-a2b8-4a9046dbd3de\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") "
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.339284 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcq9j\" (UniqueName: \"kubernetes.io/projected/5c260537-1016-427a-a2b8-4a9046dbd3de-kube-api-access-rcq9j\") pod \"5c260537-1016-427a-a2b8-4a9046dbd3de\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") "
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.339366 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-config\") pod \"5c260537-1016-427a-a2b8-4a9046dbd3de\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") "
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.339504 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-nb\") pod \"5c260537-1016-427a-a2b8-4a9046dbd3de\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") "
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.339947 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-sb\") pod \"5c260537-1016-427a-a2b8-4a9046dbd3de\" (UID: \"5c260537-1016-427a-a2b8-4a9046dbd3de\") "
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.344594 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c260537-1016-427a-a2b8-4a9046dbd3de-kube-api-access-rcq9j" (OuterVolumeSpecName: "kube-api-access-rcq9j") pod "5c260537-1016-427a-a2b8-4a9046dbd3de" (UID: "5c260537-1016-427a-a2b8-4a9046dbd3de"). InnerVolumeSpecName "kube-api-access-rcq9j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.396967 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c260537-1016-427a-a2b8-4a9046dbd3de" (UID: "5c260537-1016-427a-a2b8-4a9046dbd3de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.400591 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-config" (OuterVolumeSpecName: "config") pod "5c260537-1016-427a-a2b8-4a9046dbd3de" (UID: "5c260537-1016-427a-a2b8-4a9046dbd3de"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.408477 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c260537-1016-427a-a2b8-4a9046dbd3de" (UID: "5c260537-1016-427a-a2b8-4a9046dbd3de"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.408632 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c260537-1016-427a-a2b8-4a9046dbd3de" (UID: "5c260537-1016-427a-a2b8-4a9046dbd3de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.442140 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.442181 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcq9j\" (UniqueName: \"kubernetes.io/projected/5c260537-1016-427a-a2b8-4a9046dbd3de-kube-api-access-rcq9j\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.442197 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-config\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.442209 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.442221 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c260537-1016-427a-a2b8-4a9046dbd3de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Need to start a new one" pod="openstack/glance-db-sync-nvxxf" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.849109 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-config-data\") pod \"21c27657-8048-4b06-9079-4e89e71f369e\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.849189 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v666\" (UniqueName: \"kubernetes.io/projected/21c27657-8048-4b06-9079-4e89e71f369e-kube-api-access-2v666\") pod \"21c27657-8048-4b06-9079-4e89e71f369e\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.849294 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-db-sync-config-data\") pod \"21c27657-8048-4b06-9079-4e89e71f369e\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.849427 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-combined-ca-bundle\") pod \"21c27657-8048-4b06-9079-4e89e71f369e\" (UID: \"21c27657-8048-4b06-9079-4e89e71f369e\") " Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.856406 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c27657-8048-4b06-9079-4e89e71f369e-kube-api-access-2v666" (OuterVolumeSpecName: "kube-api-access-2v666") pod "21c27657-8048-4b06-9079-4e89e71f369e" (UID: "21c27657-8048-4b06-9079-4e89e71f369e"). InnerVolumeSpecName "kube-api-access-2v666". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.858643 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "21c27657-8048-4b06-9079-4e89e71f369e" (UID: "21c27657-8048-4b06-9079-4e89e71f369e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.880095 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21c27657-8048-4b06-9079-4e89e71f369e" (UID: "21c27657-8048-4b06-9079-4e89e71f369e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.934722 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-config-data" (OuterVolumeSpecName: "config-data") pod "21c27657-8048-4b06-9079-4e89e71f369e" (UID: "21c27657-8048-4b06-9079-4e89e71f369e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.953425 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.953630 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.953743 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v666\" (UniqueName: \"kubernetes.io/projected/21c27657-8048-4b06-9079-4e89e71f369e-kube-api-access-2v666\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:58 crc kubenswrapper[4892]: I1006 12:25:58.953852 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c27657-8048-4b06-9079-4e89e71f369e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.090703 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pk4nn" event={"ID":"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3","Type":"ContainerStarted","Data":"e7522be5cdf0eb62acdf83af5048209403247bb5accda3c82d2aeeeea284ed4f"} Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.101270 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" event={"ID":"5c260537-1016-427a-a2b8-4a9046dbd3de","Type":"ContainerDied","Data":"2637a77367972bed90787e337b012d5ee051ea9b6577c14ff9471dde64d50fdd"} Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.101369 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-576bc8c5c-zqcrg" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.101375 4892 scope.go:117] "RemoveContainer" containerID="2a9ee1a37261f8bc47efa6c831feb86237893b5061cba640c7269575eb601583" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.104100 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-zjjqh" event={"ID":"425bb472-054f-4ce7-8788-f63e794dff02","Type":"ContainerStarted","Data":"b83501735257d54d9d2d40c03459e09b3dbfd6e3c193cc6ad510c08d53ca7fd5"} Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.119591 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-pk4nn" podStartSLOduration=5.313144325 podStartE2EDuration="13.119566114s" podCreationTimestamp="2025-10-06 12:25:46 +0000 UTC" firstStartedPulling="2025-10-06 12:25:50.108011478 +0000 UTC m=+1036.657717243" lastFinishedPulling="2025-10-06 12:25:57.914433267 +0000 UTC m=+1044.464139032" observedRunningTime="2025-10-06 12:25:59.106815691 +0000 UTC m=+1045.656521496" watchObservedRunningTime="2025-10-06 12:25:59.119566114 +0000 UTC m=+1045.669271909" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.124086 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nvxxf" event={"ID":"21c27657-8048-4b06-9079-4e89e71f369e","Type":"ContainerDied","Data":"12612ec85e6eed7103d8f1a265ff5d3b35cd481a7d4f593b57fda74808f3ff4e"} Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.124158 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12612ec85e6eed7103d8f1a265ff5d3b35cd481a7d4f593b57fda74808f3ff4e" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.124199 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nvxxf" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.129394 4892 scope.go:117] "RemoveContainer" containerID="41b8e69ae545d953b71d709ff8026dfb58f5b19f07baa3e9635013876c7ab0a2" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.137048 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-zjjqh" podStartSLOduration=6.149653078 podStartE2EDuration="14.13702623s" podCreationTimestamp="2025-10-06 12:25:45 +0000 UTC" firstStartedPulling="2025-10-06 12:25:50.109848564 +0000 UTC m=+1036.659554329" lastFinishedPulling="2025-10-06 12:25:58.097221696 +0000 UTC m=+1044.646927481" observedRunningTime="2025-10-06 12:25:59.127946327 +0000 UTC m=+1045.677652112" watchObservedRunningTime="2025-10-06 12:25:59.13702623 +0000 UTC m=+1045.686732005" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.160841 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-576bc8c5c-zqcrg"] Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.169369 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-576bc8c5c-zqcrg"] Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.479424 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5779c9bcdf-jwmns"] Oct 06 12:25:59 crc kubenswrapper[4892]: E1006 12:25:59.479943 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerName="init" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.479955 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerName="init" Oct 06 12:25:59 crc kubenswrapper[4892]: E1006 12:25:59.479971 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d54d8a4-1c52-4e19-a689-e68c7f751bbd" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.479977 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d54d8a4-1c52-4e19-a689-e68c7f751bbd" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: E1006 12:25:59.479990 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bdc381-3838-44ce-bcfc-9cc714973c19" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.479997 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bdc381-3838-44ce-bcfc-9cc714973c19" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: E1006 12:25:59.480009 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c01bdac-6171-4739-8a4d-93e871a3cbe4" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480014 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c01bdac-6171-4739-8a4d-93e871a3cbe4" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: E1006 12:25:59.480030 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c27657-8048-4b06-9079-4e89e71f369e" containerName="glance-db-sync" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480035 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c27657-8048-4b06-9079-4e89e71f369e" containerName="glance-db-sync" Oct 06 12:25:59 crc kubenswrapper[4892]: E1006 12:25:59.480051 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" 
containerName="dnsmasq-dns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480057 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerName="dnsmasq-dns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480212 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bdc381-3838-44ce-bcfc-9cc714973c19" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480220 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c27657-8048-4b06-9079-4e89e71f369e" containerName="glance-db-sync" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480227 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d54d8a4-1c52-4e19-a689-e68c7f751bbd" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480241 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c01bdac-6171-4739-8a4d-93e871a3cbe4" containerName="mariadb-database-create" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.480254 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" containerName="dnsmasq-dns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.483954 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.503367 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5779c9bcdf-jwmns"] Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.568999 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.569051 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-nb\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.569130 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-sb\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.569188 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-config\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.569355 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-swift-storage-0\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: 
\"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.569392 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz628\" (UniqueName: \"kubernetes.io/projected/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-kube-api-access-pz628\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.671418 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-sb\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.671501 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-config\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.671580 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-swift-storage-0\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.671603 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz628\" (UniqueName: \"kubernetes.io/projected/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-kube-api-access-pz628\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.671629 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.671649 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-nb\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.672654 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-nb\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.672705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-sb\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") 
" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.672710 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-config\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.672710 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-swift-storage-0\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.673065 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.693597 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz628\" (UniqueName: \"kubernetes.io/projected/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-kube-api-access-pz628\") pod \"dnsmasq-dns-5779c9bcdf-jwmns\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:25:59 crc kubenswrapper[4892]: I1006 12:25:59.814162 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:26:00 crc kubenswrapper[4892]: I1006 12:26:00.186993 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c260537-1016-427a-a2b8-4a9046dbd3de" path="/var/lib/kubelet/pods/5c260537-1016-427a-a2b8-4a9046dbd3de/volumes" Oct 06 12:26:00 crc kubenswrapper[4892]: I1006 12:26:00.319912 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5779c9bcdf-jwmns"] Oct 06 12:26:00 crc kubenswrapper[4892]: W1006 12:26:00.323469 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac7c4a4_1a0b_45f6_9018_eb3b8acc184b.slice/crio-c53bae80eedcfb6216becfc66b6008a3efd2b56023b236da5f873645c455388f WatchSource:0}: Error finding container c53bae80eedcfb6216becfc66b6008a3efd2b56023b236da5f873645c455388f: Status 404 returned error can't find the container with id c53bae80eedcfb6216becfc66b6008a3efd2b56023b236da5f873645c455388f Oct 06 12:26:01 crc kubenswrapper[4892]: I1006 12:26:01.146004 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" event={"ID":"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b","Type":"ContainerStarted","Data":"2df017aed337480f69e8377ba55b89943bbc48d1290dd63cb5d88a1904a5f9b6"} Oct 06 12:26:01 crc kubenswrapper[4892]: I1006 12:26:01.147134 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" event={"ID":"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b","Type":"ContainerStarted","Data":"c53bae80eedcfb6216becfc66b6008a3efd2b56023b236da5f873645c455388f"} Oct 06 12:26:02 crc kubenswrapper[4892]: I1006 12:26:02.160399 4892 generic.go:334] "Generic (PLEG): container finished" podID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" 
containerID="2df017aed337480f69e8377ba55b89943bbc48d1290dd63cb5d88a1904a5f9b6" exitCode=0 Oct 06 12:26:02 crc kubenswrapper[4892]: I1006 12:26:02.160474 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" event={"ID":"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b","Type":"ContainerDied","Data":"2df017aed337480f69e8377ba55b89943bbc48d1290dd63cb5d88a1904a5f9b6"} Oct 06 12:26:03 crc kubenswrapper[4892]: I1006 12:26:03.176348 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" event={"ID":"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b","Type":"ContainerStarted","Data":"2395b8cdc2310b8b0c1c1346842e9388f7784f0e86049bb56686921edbe05d7c"} Oct 06 12:26:03 crc kubenswrapper[4892]: I1006 12:26:03.176649 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:26:03 crc kubenswrapper[4892]: I1006 12:26:03.197133 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" podStartSLOduration=4.19711058 podStartE2EDuration="4.19711058s" podCreationTimestamp="2025-10-06 12:25:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:03.195503161 +0000 UTC m=+1049.745208976" watchObservedRunningTime="2025-10-06 12:26:03.19711058 +0000 UTC m=+1049.746816355" Oct 06 12:26:04 crc kubenswrapper[4892]: I1006 12:26:04.188210 4892 generic.go:334] "Generic (PLEG): container finished" podID="425bb472-054f-4ce7-8788-f63e794dff02" containerID="b83501735257d54d9d2d40c03459e09b3dbfd6e3c193cc6ad510c08d53ca7fd5" exitCode=0 Oct 06 12:26:04 crc kubenswrapper[4892]: I1006 12:26:04.188290 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-zjjqh" event={"ID":"425bb472-054f-4ce7-8788-f63e794dff02","Type":"ContainerDied","Data":"b83501735257d54d9d2d40c03459e09b3dbfd6e3c193cc6ad510c08d53ca7fd5"} Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.204168 4892 generic.go:334] "Generic (PLEG): container finished" podID="355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" containerID="e7522be5cdf0eb62acdf83af5048209403247bb5accda3c82d2aeeeea284ed4f" exitCode=0 Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.204252 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pk4nn" event={"ID":"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3","Type":"ContainerDied","Data":"e7522be5cdf0eb62acdf83af5048209403247bb5accda3c82d2aeeeea284ed4f"} Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.726940 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.811110 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-db-sync-config-data\") pod \"425bb472-054f-4ce7-8788-f63e794dff02\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") "
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.811285 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/425bb472-054f-4ce7-8788-f63e794dff02-kube-api-access-5kkf6\") pod \"425bb472-054f-4ce7-8788-f63e794dff02\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") "
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.811375 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-config-data\") pod \"425bb472-054f-4ce7-8788-f63e794dff02\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") "
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.811424 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-combined-ca-bundle\") pod \"425bb472-054f-4ce7-8788-f63e794dff02\" (UID: \"425bb472-054f-4ce7-8788-f63e794dff02\") "
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.820685 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/425bb472-054f-4ce7-8788-f63e794dff02-kube-api-access-5kkf6" (OuterVolumeSpecName: "kube-api-access-5kkf6") pod "425bb472-054f-4ce7-8788-f63e794dff02" (UID: "425bb472-054f-4ce7-8788-f63e794dff02"). InnerVolumeSpecName "kube-api-access-5kkf6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.828644 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "425bb472-054f-4ce7-8788-f63e794dff02" (UID: "425bb472-054f-4ce7-8788-f63e794dff02"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.868412 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "425bb472-054f-4ce7-8788-f63e794dff02" (UID: "425bb472-054f-4ce7-8788-f63e794dff02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.890492 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-config-data" (OuterVolumeSpecName: "config-data") pod "425bb472-054f-4ce7-8788-f63e794dff02" (UID: "425bb472-054f-4ce7-8788-f63e794dff02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.912914 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.912953 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kkf6\" (UniqueName: \"kubernetes.io/projected/425bb472-054f-4ce7-8788-f63e794dff02-kube-api-access-5kkf6\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.912967 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:05 crc kubenswrapper[4892]: I1006 12:26:05.912980 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/425bb472-054f-4ce7-8788-f63e794dff02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.037255 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8e32-account-create-fg2wj"] Oct 06 12:26:06 crc kubenswrapper[4892]: E1006 12:26:06.037602 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="425bb472-054f-4ce7-8788-f63e794dff02" containerName="watcher-db-sync" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.037617 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="425bb472-054f-4ce7-8788-f63e794dff02" containerName="watcher-db-sync" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.037797 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="425bb472-054f-4ce7-8788-f63e794dff02" containerName="watcher-db-sync" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.071675 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8e32-account-create-fg2wj"] Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.071853 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e32-account-create-fg2wj" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.076114 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.115380 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh7f4\" (UniqueName: \"kubernetes.io/projected/d6b639fd-a918-4284-973b-3dd64770ca40-kube-api-access-xh7f4\") pod \"barbican-8e32-account-create-fg2wj\" (UID: \"d6b639fd-a918-4284-973b-3dd64770ca40\") " pod="openstack/barbican-8e32-account-create-fg2wj" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.217671 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh7f4\" (UniqueName: \"kubernetes.io/projected/d6b639fd-a918-4284-973b-3dd64770ca40-kube-api-access-xh7f4\") pod \"barbican-8e32-account-create-fg2wj\" (UID: \"d6b639fd-a918-4284-973b-3dd64770ca40\") " pod="openstack/barbican-8e32-account-create-fg2wj" Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.249415 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.250465 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-zjjqh" event={"ID":"425bb472-054f-4ce7-8788-f63e794dff02","Type":"ContainerDied","Data":"19b6cf66b592aa8ec8b971236c67dd80f7da55b5a15dcfe80d5f4175d9c99fd2"}
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.250514 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b6cf66b592aa8ec8b971236c67dd80f7da55b5a15dcfe80d5f4175d9c99fd2"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.262317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh7f4\" (UniqueName: \"kubernetes.io/projected/d6b639fd-a918-4284-973b-3dd64770ca40-kube-api-access-xh7f4\") pod \"barbican-8e32-account-create-fg2wj\" (UID: \"d6b639fd-a918-4284-973b-3dd64770ca40\") " pod="openstack/barbican-8e32-account-create-fg2wj"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.279626 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-48d7-account-create-7wqqj"]
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.281701 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-48d7-account-create-7wqqj"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.285081 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.303343 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-48d7-account-create-7wqqj"]
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.322635 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqdwx\" (UniqueName: \"kubernetes.io/projected/ab9933b0-25cc-4543-8266-1ad5e1fd72ff-kube-api-access-kqdwx\") pod \"cinder-48d7-account-create-7wqqj\" (UID: \"ab9933b0-25cc-4543-8266-1ad5e1fd72ff\") " pod="openstack/cinder-48d7-account-create-7wqqj"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.398530 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e32-account-create-fg2wj"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.424706 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqdwx\" (UniqueName: \"kubernetes.io/projected/ab9933b0-25cc-4543-8266-1ad5e1fd72ff-kube-api-access-kqdwx\") pod \"cinder-48d7-account-create-7wqqj\" (UID: \"ab9933b0-25cc-4543-8266-1ad5e1fd72ff\") " pod="openstack/cinder-48d7-account-create-7wqqj"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.470702 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66cf-account-create-mbggr"]
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.471883 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66cf-account-create-mbggr"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.473488 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqdwx\" (UniqueName: \"kubernetes.io/projected/ab9933b0-25cc-4543-8266-1ad5e1fd72ff-kube-api-access-kqdwx\") pod \"cinder-48d7-account-create-7wqqj\" (UID: \"ab9933b0-25cc-4543-8266-1ad5e1fd72ff\") " pod="openstack/cinder-48d7-account-create-7wqqj"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.475881 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.482571 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66cf-account-create-mbggr"]
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.525879 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9ffm\" (UniqueName: \"kubernetes.io/projected/e70aa8f3-809f-4f1d-b8dd-8ecee4996fec-kube-api-access-k9ffm\") pod \"neutron-66cf-account-create-mbggr\" (UID: \"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec\") " pod="openstack/neutron-66cf-account-create-mbggr"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.602703 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-48d7-account-create-7wqqj"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.627504 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9ffm\" (UniqueName: \"kubernetes.io/projected/e70aa8f3-809f-4f1d-b8dd-8ecee4996fec-kube-api-access-k9ffm\") pod \"neutron-66cf-account-create-mbggr\" (UID: \"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec\") " pod="openstack/neutron-66cf-account-create-mbggr"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.650278 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9ffm\" (UniqueName: \"kubernetes.io/projected/e70aa8f3-809f-4f1d-b8dd-8ecee4996fec-kube-api-access-k9ffm\") pod \"neutron-66cf-account-create-mbggr\" (UID: \"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec\") " pod="openstack/neutron-66cf-account-create-mbggr"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.708138 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-pk4nn"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.728270 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-config-data\") pod \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") "
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.728354 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqh9b\" (UniqueName: \"kubernetes.io/projected/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-kube-api-access-dqh9b\") pod \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") "
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.728476 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-combined-ca-bundle\") pod \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\" (UID: \"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3\") "
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.747047 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-kube-api-access-dqh9b" (OuterVolumeSpecName: "kube-api-access-dqh9b") pod "355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" (UID: "355f8351-e83b-4a32-83a2-a1c2f3dc9ca3"). InnerVolumeSpecName "kube-api-access-dqh9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.766040 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" (UID: "355f8351-e83b-4a32-83a2-a1c2f3dc9ca3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.781878 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-config-data" (OuterVolumeSpecName: "config-data") pod "355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" (UID: "355f8351-e83b-4a32-83a2-a1c2f3dc9ca3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.830318 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.830361 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.830370 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqh9b\" (UniqueName: \"kubernetes.io/projected/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3-kube-api-access-dqh9b\") on node \"crc\" DevicePath \"\""
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.835770 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66cf-account-create-mbggr"
Oct 06 12:26:06 crc kubenswrapper[4892]: I1006 12:26:06.913826 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8e32-account-create-fg2wj"]
Oct 06 12:26:06 crc kubenswrapper[4892]: W1006 12:26:06.922177 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6b639fd_a918_4284_973b_3dd64770ca40.slice/crio-ac9af71a6976a5d175392ab96e827268baa22ddc077eed74571ff6fd05f7b959 WatchSource:0}: Error finding container ac9af71a6976a5d175392ab96e827268baa22ddc077eed74571ff6fd05f7b959: Status 404 returned error can't find the container with id ac9af71a6976a5d175392ab96e827268baa22ddc077eed74571ff6fd05f7b959
Oct 06 12:26:07 crc kubenswrapper[4892]: W1006 12:26:07.031724 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab9933b0_25cc_4543_8266_1ad5e1fd72ff.slice/crio-ad57c362babbdc9a4bdd5df2565851edfe1c647bdc05296cfacd62691f4fce11 WatchSource:0}: Error finding container ad57c362babbdc9a4bdd5df2565851edfe1c647bdc05296cfacd62691f4fce11: Status 404 returned error can't find the container with id ad57c362babbdc9a4bdd5df2565851edfe1c647bdc05296cfacd62691f4fce11
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.032615 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-48d7-account-create-7wqqj"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.256594 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66cf-account-create-mbggr"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.263047 4892 generic.go:334] "Generic (PLEG): container finished" podID="ab9933b0-25cc-4543-8266-1ad5e1fd72ff" containerID="fe14ac778c8b08fc05e6b34f35457cb0af828a54d5a9e0885b0a36ccedeb2a37" exitCode=0
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.263092 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-48d7-account-create-7wqqj" event={"ID":"ab9933b0-25cc-4543-8266-1ad5e1fd72ff","Type":"ContainerDied","Data":"fe14ac778c8b08fc05e6b34f35457cb0af828a54d5a9e0885b0a36ccedeb2a37"}
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.263150 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-48d7-account-create-7wqqj" event={"ID":"ab9933b0-25cc-4543-8266-1ad5e1fd72ff","Type":"ContainerStarted","Data":"ad57c362babbdc9a4bdd5df2565851edfe1c647bdc05296cfacd62691f4fce11"}
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.264945 4892 generic.go:334] "Generic (PLEG): container finished" podID="d6b639fd-a918-4284-973b-3dd64770ca40" containerID="56cc6a8ccdb7232475990d2549c4699fc09caec18349135f904c1dde40002310" exitCode=0
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.265005 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e32-account-create-fg2wj" event={"ID":"d6b639fd-a918-4284-973b-3dd64770ca40","Type":"ContainerDied","Data":"56cc6a8ccdb7232475990d2549c4699fc09caec18349135f904c1dde40002310"}
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.265041 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e32-account-create-fg2wj" event={"ID":"d6b639fd-a918-4284-973b-3dd64770ca40","Type":"ContainerStarted","Data":"ac9af71a6976a5d175392ab96e827268baa22ddc077eed74571ff6fd05f7b959"}
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.269464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-pk4nn" event={"ID":"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3","Type":"ContainerDied","Data":"f849ae713534b9814fe6add0d9f9174b0137fb20ac564b17f087fdec7d6d9aef"}
event for pod" pod="openstack/keystone-db-sync-pk4nn" event={"ID":"355f8351-e83b-4a32-83a2-a1c2f3dc9ca3","Type":"ContainerDied","Data":"f849ae713534b9814fe6add0d9f9174b0137fb20ac564b17f087fdec7d6d9aef"} Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.269514 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f849ae713534b9814fe6add0d9f9174b0137fb20ac564b17f087fdec7d6d9aef" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.269517 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-pk4nn" Oct 06 12:26:07 crc kubenswrapper[4892]: W1006 12:26:07.271882 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode70aa8f3_809f_4f1d_b8dd_8ecee4996fec.slice/crio-78ff9528007aadb52d93de6e17e19fc923c423936b746d80db060c6cf7c97a3b WatchSource:0}: Error finding container 78ff9528007aadb52d93de6e17e19fc923c423936b746d80db060c6cf7c97a3b: Status 404 returned error can't find the container with id 78ff9528007aadb52d93de6e17e19fc923c423936b746d80db060c6cf7c97a3b Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.440805 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5779c9bcdf-jwmns"] Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.441243 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" podUID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" containerName="dnsmasq-dns" containerID="cri-o://2395b8cdc2310b8b0c1c1346842e9388f7784f0e86049bb56686921edbe05d7c" gracePeriod=10 Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.445249 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.463183 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-c55qx"] Oct 06 12:26:07 crc kubenswrapper[4892]: E1006 12:26:07.463556 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" containerName="keystone-db-sync" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.463582 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" containerName="keystone-db-sync" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.463746 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" containerName="keystone-db-sync" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.464672 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.471959 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qj9t"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.472125 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.472228 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.486182 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.495659 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c55qx"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.578130 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c67cb9c7-mczvc"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.585746 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.601838 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c67cb9c7-mczvc"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.623586 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.624655 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.660674 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-dswng"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.660909 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663255 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-scripts\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663285 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-sb\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663304 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bd6w\" (UniqueName: \"kubernetes.io/projected/750105b4-a937-4973-94de-6ee9bee54c80-kube-api-access-8bd6w\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663377 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-swift-storage-0\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc"
\"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-swift-storage-0\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663418 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-config\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663438 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-config-data\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663456 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-svc\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663500 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-fernet-keys\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663523 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-credential-keys\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663543 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663573 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-combined-ca-bundle\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.663592 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4cpb\" (UniqueName: \"kubernetes.io/projected/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-kube-api-access-t4cpb\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.714553 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.717257 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.722830 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.766911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-svc\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.766969 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.766993 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-fernet-keys\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767010 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767033 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-credential-keys\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767049 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clbzv\" (UniqueName: \"kubernetes.io/projected/6002d110-e634-47ab-b33b-652cbf7b3466-kube-api-access-clbzv\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767064 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767093 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-combined-ca-bundle\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767107 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4cpb\" (UniqueName: \"kubernetes.io/projected/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-kube-api-access-t4cpb\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx"
"operationExecutor.MountVolume started for volume \"kube-api-access-t4cpb\" (UniqueName: \"kubernetes.io/projected/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-kube-api-access-t4cpb\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767155 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-scripts\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-sb\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767186 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bd6w\" (UniqueName: \"kubernetes.io/projected/750105b4-a937-4973-94de-6ee9bee54c80-kube-api-access-8bd6w\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767210 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767237 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-swift-storage-0\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767270 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d110-e634-47ab-b33b-652cbf7b3466-logs\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767287 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-config\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.767303 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-config-data\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.769024 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-sb\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.769699 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-swift-storage-0\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.771277 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-config\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.771347 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.771298 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-nb\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.772011 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-svc\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.778382 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-combined-ca-bundle\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.781964 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-fernet-keys\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.782567 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-config-data\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.791950 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-scripts\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.795445 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-credential-keys\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.805069 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bd6w\" (UniqueName: \"kubernetes.io/projected/750105b4-a937-4973-94de-6ee9bee54c80-kube-api-access-8bd6w\") pod \"dnsmasq-dns-6c67cb9c7-mczvc\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.805812 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4cpb\" (UniqueName: \"kubernetes.io/projected/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-kube-api-access-t4cpb\") pod \"keystone-bootstrap-c55qx\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.834576 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.835748 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.842615 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.863438 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868560 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-logs\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868620 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868673 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d110-e634-47ab-b33b-652cbf7b3466-logs\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868702 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdpmw\" (UniqueName: \"kubernetes.io/projected/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-kube-api-access-vdpmw\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868718 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc 
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868765 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868807 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clbzv\" (UniqueName: \"kubernetes.io/projected/6002d110-e634-47ab-b33b-652cbf7b3466-kube-api-access-clbzv\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.868840 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.876859 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.877650 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d110-e634-47ab-b33b-652cbf7b3466-logs\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.882972 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.885131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-config-data\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.900976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.923039 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clbzv\" (UniqueName: \"kubernetes.io/projected/6002d110-e634-47ab-b33b-652cbf7b3466-kube-api-access-clbzv\") pod \"watcher-decision-engine-0\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.953117 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-686dd487d5-78rg9"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.955194 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686dd487d5-78rg9"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.965953 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-4kh6h"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.966122 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.966233 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.966999 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686dd487d5-78rg9"]
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970756 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdpmw\" (UniqueName: \"kubernetes.io/projected/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-kube-api-access-vdpmw\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970804 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970823 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-config-data\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970860 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970881 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-config-data\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970914 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-logs\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
pod="openstack/watcher-applier-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970935 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.970978 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-logs\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.971015 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dsf7\" (UniqueName: \"kubernetes.io/projected/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-kube-api-access-4dsf7\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.971953 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-logs\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.974941 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.982787 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.984936 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.985216 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.985543 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-config-data\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0" Oct 06 12:26:07 crc kubenswrapper[4892]: I1006 12:26:07.991505 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:07.997098 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.004253 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.004331 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.005190 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdpmw\" (UniqueName: \"kubernetes.io/projected/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-kube-api-access-vdpmw\") pod \"watcher-api-0\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " pod="openstack/watcher-api-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.027434 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.052735 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jps9b"]
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.053925 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jps9b"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.055680 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.059645 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.059872 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.060002 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-jjlfz"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.063618 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c67cb9c7-mczvc"]
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.074447 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-config-data\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075207 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b783419-dc0f-4bac-84fd-043c68de8718-logs\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075266 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-logs\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075320 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b783419-dc0f-4bac-84fd-043c68de8718-horizon-secret-key\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075424 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95dhd\" (UniqueName: \"kubernetes.io/projected/2b783419-dc0f-4bac-84fd-043c68de8718-kube-api-access-95dhd\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075459 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-scripts\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075502 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-config-data\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.075524 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dsf7\" (UniqueName: \"kubernetes.io/projected/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-kube-api-access-4dsf7\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.079791 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.080104 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-logs\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.080795 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-config-data\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.081145 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.088304 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.090467 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.093192 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.093506 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.093682 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f6nmd" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.093813 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.096933 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.099850 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jps9b"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.118656 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dsf7\" (UniqueName: \"kubernetes.io/projected/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-kube-api-access-4dsf7\") pod \"watcher-applier-0\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") " pod="openstack/watcher-applier-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.124774 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.141864 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c68f45bbf-65dm9"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.143357 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.155107 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7976c9f5c7-4g42j"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.160763 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.169715 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c68f45bbf-65dm9"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.178966 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179011 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179044 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179088 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95dhd\" (UniqueName: \"kubernetes.io/projected/2b783419-dc0f-4bac-84fd-043c68de8718-kube-api-access-95dhd\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179116 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179135 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179492 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-scripts\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179531 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-config-data\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179556 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-combined-ca-bundle\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179584 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-config-data\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179619 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179636 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-run-httpd\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179654 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-scripts\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179737 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b783419-dc0f-4bac-84fd-043c68de8718-logs\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179782 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7hbh\" (UniqueName: \"kubernetes.io/projected/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-kube-api-access-t7hbh\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179808 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-scripts\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179829 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179848 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh2hz\" (UniqueName: \"kubernetes.io/projected/f75c5afb-b292-41c0-9e77-b9dc84f38b45-kube-api-access-kh2hz\") 
pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179879 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-config-data\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179902 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-log-httpd\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b783419-dc0f-4bac-84fd-043c68de8718-horizon-secret-key\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179954 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2673cbbe-dc84-4a24-a48a-303029fcc02a-logs\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179972 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8rh\" (UniqueName: \"kubernetes.io/projected/2673cbbe-dc84-4a24-a48a-303029fcc02a-kube-api-access-mr8rh\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.179988 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-logs\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.180895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-scripts\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.181704 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-config-data\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 
crc kubenswrapper[4892]: I1006 12:26:08.181970 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b783419-dc0f-4bac-84fd-043c68de8718-logs\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.189171 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b783419-dc0f-4bac-84fd-043c68de8718-horizon-secret-key\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.197226 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95dhd\" (UniqueName: \"kubernetes.io/projected/2b783419-dc0f-4bac-84fd-043c68de8718-kube-api-access-95dhd\") pod \"horizon-686dd487d5-78rg9\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.204904 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7976c9f5c7-4g42j"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.236392 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.240924 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.242844 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.242852 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.261541 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.282848 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-swift-storage-0\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.282911 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70af3608-42f9-456b-9035-3030027e04ca-logs\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.282940 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz648\" (UniqueName: \"kubernetes.io/projected/70af3608-42f9-456b-9035-3030027e04ca-kube-api-access-pz648\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.282965 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-svc\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283512 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-config\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283565 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8dn\" (UniqueName: \"kubernetes.io/projected/d6990a45-0f43-40a0-9c44-54e026d3acd2-kube-api-access-fs8dn\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283650 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7hbh\" (UniqueName: \"kubernetes.io/projected/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-kube-api-access-t7hbh\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283682 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-scripts\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283712 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283726 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh2hz\" (UniqueName: \"kubernetes.io/projected/f75c5afb-b292-41c0-9e77-b9dc84f38b45-kube-api-access-kh2hz\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-config-data\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283799 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-log-httpd\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283824 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283839 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2673cbbe-dc84-4a24-a48a-303029fcc02a-logs\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283858 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8rh\" (UniqueName: \"kubernetes.io/projected/2673cbbe-dc84-4a24-a48a-303029fcc02a-kube-api-access-mr8rh\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283876 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-logs\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283905 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-nb\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283929 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283955 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-config-data\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.283979 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.284018 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.284307 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2673cbbe-dc84-4a24-a48a-303029fcc02a-logs\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 
12:26:08.287130 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-sb\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287212 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287230 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287255 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-scripts\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287282 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-combined-ca-bundle\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287313 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-config-data\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287363 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70af3608-42f9-456b-9035-3030027e04ca-horizon-secret-key\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287423 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-run-httpd\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.287443 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-scripts\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.289578 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-config-data\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.290316 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-logs\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.290687 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-scripts\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.290991 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.295203 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.295484 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-run-httpd\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.295673 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-log-httpd\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.296864 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.300809 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-scripts\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.301079 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.301652 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.301812 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-config-data\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.302932 4892 generic.go:334] "Generic (PLEG): container finished" podID="e70aa8f3-809f-4f1d-b8dd-8ecee4996fec" containerID="0e79e6fdd772badf78468b0c2aeff4552c4709b13d72819e5b2e3cdd3c33b786" exitCode=0 Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.302998 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66cf-account-create-mbggr" event={"ID":"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec","Type":"ContainerDied","Data":"0e79e6fdd772badf78468b0c2aeff4552c4709b13d72819e5b2e3cdd3c33b786"} Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.303019 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66cf-account-create-mbggr" event={"ID":"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec","Type":"ContainerStarted","Data":"78ff9528007aadb52d93de6e17e19fc923c423936b746d80db060c6cf7c97a3b"} Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.313320 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.314788 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-scripts\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.342193 4892 generic.go:334] "Generic (PLEG): container finished" podID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" containerID="2395b8cdc2310b8b0c1c1346842e9388f7784f0e86049bb56686921edbe05d7c" exitCode=0 Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.342320 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" event={"ID":"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b","Type":"ContainerDied","Data":"2395b8cdc2310b8b0c1c1346842e9388f7784f0e86049bb56686921edbe05d7c"} Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.353893 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.378737 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8rh\" (UniqueName: \"kubernetes.io/projected/2673cbbe-dc84-4a24-a48a-303029fcc02a-kube-api-access-mr8rh\") pod \"placement-db-sync-jps9b\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.378774 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh2hz\" (UniqueName: \"kubernetes.io/projected/f75c5afb-b292-41c0-9e77-b9dc84f38b45-kube-api-access-kh2hz\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.379075 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.379217 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7hbh\" (UniqueName: \"kubernetes.io/projected/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-kube-api-access-t7hbh\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.379897 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-config-data\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.383010 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-combined-ca-bundle\") pod \"placement-db-sync-jps9b\" (UID: 
\"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.390152 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70af3608-42f9-456b-9035-3030027e04ca-logs\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.390423 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz648\" (UniqueName: \"kubernetes.io/projected/70af3608-42f9-456b-9035-3030027e04ca-kube-api-access-pz648\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.390543 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.390655 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-svc\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.390758 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-config\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.390849 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8dn\" (UniqueName: \"kubernetes.io/projected/d6990a45-0f43-40a0-9c44-54e026d3acd2-kube-api-access-fs8dn\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.390928 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.393661 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394105 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-nb\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: 
\"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394150 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-config-data\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394193 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394213 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394233 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-logs\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394259 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394278 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-sb\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394362 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-scripts\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394394 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrvgm\" (UniqueName: \"kubernetes.io/projected/2206d371-8e41-46c6-a255-2ba333463847-kube-api-access-nrvgm\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394434 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70af3608-42f9-456b-9035-3030027e04ca-horizon-secret-key\") pod \"horizon-7976c9f5c7-4g42j\" (UID: 
\"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.394563 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-swift-storage-0\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.395056 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70af3608-42f9-456b-9035-3030027e04ca-logs\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.395537 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-swift-storage-0\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.396012 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-nb\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.396507 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-sb\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.397040 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-scripts\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.397498 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-svc\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.398482 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-config\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.403600 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-config-data\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.411153 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.413733 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70af3608-42f9-456b-9035-3030027e04ca-horizon-secret-key\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.421337 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.431047 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8dn\" (UniqueName: \"kubernetes.io/projected/d6990a45-0f43-40a0-9c44-54e026d3acd2-kube-api-access-fs8dn\") pod \"dnsmasq-dns-6c68f45bbf-65dm9\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.431060 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz648\" (UniqueName: \"kubernetes.io/projected/70af3608-42f9-456b-9035-3030027e04ca-kube-api-access-pz648\") pod \"horizon-7976c9f5c7-4g42j\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.490235 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.495233 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496034 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrvgm\" (UniqueName: \"kubernetes.io/projected/2206d371-8e41-46c6-a255-2ba333463847-kube-api-access-nrvgm\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496149 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496203 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496223 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496302 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496350 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496397 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-logs\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496510 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.496886 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Oct 06 
12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.497247 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-logs\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.497622 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.503223 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.504134 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.504395 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.505824 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.531048 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.532796 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrvgm\" (UniqueName: \"kubernetes.io/projected/2206d371-8e41-46c6-a255-2ba333463847-kube-api-access-nrvgm\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.583502 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.597806 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-config\") pod \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " Oct 06 12:26:08 crc 
kubenswrapper[4892]: I1006 12:26:08.597902 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc\") pod \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.597924 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-sb\") pod \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.597953 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-nb\") pod \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.597984 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz628\" (UniqueName: \"kubernetes.io/projected/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-kube-api-access-pz628\") pod \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.598052 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-swift-storage-0\") pod \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.624444 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-kube-api-access-pz628" (OuterVolumeSpecName: "kube-api-access-pz628") pod "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" (UID: "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b"). InnerVolumeSpecName "kube-api-access-pz628". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.624866 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.667263 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.699096 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-config" (OuterVolumeSpecName: "config") pod "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" (UID: "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.706799 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz628\" (UniqueName: \"kubernetes.io/projected/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-kube-api-access-pz628\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.706829 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.732022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" (UID: "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.740841 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" (UID: "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.751651 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" (UID: "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.808535 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" (UID: "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.812213 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc\") pod \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\" (UID: \"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b\") " Oct 06 12:26:08 crc kubenswrapper[4892]: W1006 12:26:08.812290 4892 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b/volumes/kubernetes.io~configmap/dns-svc Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.812315 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" (UID: "cac7c4a4-1a0b-45f6-9018-eb3b8acc184b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.813142 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.813174 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.813189 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.813201 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:08 crc kubenswrapper[4892]: I1006 12:26:08.861972 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.060585 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8e32-account-create-fg2wj" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.112658 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-48d7-account-create-7wqqj" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.123405 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh7f4\" (UniqueName: \"kubernetes.io/projected/d6b639fd-a918-4284-973b-3dd64770ca40-kube-api-access-xh7f4\") pod \"d6b639fd-a918-4284-973b-3dd64770ca40\" (UID: \"d6b639fd-a918-4284-973b-3dd64770ca40\") " Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.131574 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b639fd-a918-4284-973b-3dd64770ca40-kube-api-access-xh7f4" (OuterVolumeSpecName: "kube-api-access-xh7f4") pod "d6b639fd-a918-4284-973b-3dd64770ca40" (UID: "d6b639fd-a918-4284-973b-3dd64770ca40"). InnerVolumeSpecName "kube-api-access-xh7f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.224510 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqdwx\" (UniqueName: \"kubernetes.io/projected/ab9933b0-25cc-4543-8266-1ad5e1fd72ff-kube-api-access-kqdwx\") pod \"ab9933b0-25cc-4543-8266-1ad5e1fd72ff\" (UID: \"ab9933b0-25cc-4543-8266-1ad5e1fd72ff\") " Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.225203 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh7f4\" (UniqueName: \"kubernetes.io/projected/d6b639fd-a918-4284-973b-3dd64770ca40-kube-api-access-xh7f4\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.229517 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9933b0-25cc-4543-8266-1ad5e1fd72ff-kube-api-access-kqdwx" (OuterVolumeSpecName: "kube-api-access-kqdwx") pod "ab9933b0-25cc-4543-8266-1ad5e1fd72ff" (UID: "ab9933b0-25cc-4543-8266-1ad5e1fd72ff"). 
InnerVolumeSpecName "kube-api-access-kqdwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.276477 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c67cb9c7-mczvc"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.327082 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqdwx\" (UniqueName: \"kubernetes.io/projected/ab9933b0-25cc-4543-8266-1ad5e1fd72ff-kube-api-access-kqdwx\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.381253 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" event={"ID":"cac7c4a4-1a0b-45f6-9018-eb3b8acc184b","Type":"ContainerDied","Data":"c53bae80eedcfb6216becfc66b6008a3efd2b56023b236da5f873645c455388f"} Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.381301 4892 scope.go:117] "RemoveContainer" containerID="2395b8cdc2310b8b0c1c1346842e9388f7784f0e86049bb56686921edbe05d7c" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.381440 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5779c9bcdf-jwmns" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.388562 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-48d7-account-create-7wqqj" event={"ID":"ab9933b0-25cc-4543-8266-1ad5e1fd72ff","Type":"ContainerDied","Data":"ad57c362babbdc9a4bdd5df2565851edfe1c647bdc05296cfacd62691f4fce11"} Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.388602 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad57c362babbdc9a4bdd5df2565851edfe1c647bdc05296cfacd62691f4fce11" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.388581 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-48d7-account-create-7wqqj" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.390078 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8e32-account-create-fg2wj" event={"ID":"d6b639fd-a918-4284-973b-3dd64770ca40","Type":"ContainerDied","Data":"ac9af71a6976a5d175392ab96e827268baa22ddc077eed74571ff6fd05f7b959"} Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.390125 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac9af71a6976a5d175392ab96e827268baa22ddc077eed74571ff6fd05f7b959" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.390163 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8e32-account-create-fg2wj" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.391832 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" event={"ID":"750105b4-a937-4973-94de-6ee9bee54c80","Type":"ContainerStarted","Data":"8c2d3965460dd15cf703397fdcd15b8f59c87c100715381781fb60a204513b25"} Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.425280 4892 scope.go:117] "RemoveContainer" containerID="2df017aed337480f69e8377ba55b89943bbc48d1290dd63cb5d88a1904a5f9b6" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.468113 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5779c9bcdf-jwmns"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.482355 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5779c9bcdf-jwmns"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.529680 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.543855 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:26:09 crc kubenswrapper[4892]: W1006 12:26:09.554933 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6002d110_e634_47ab_b33b_652cbf7b3466.slice/crio-185f3e3e519bd3ba5a5684f2a5af6ffbe2c957e6a288affcf5220fb3bba67df2 WatchSource:0}: Error finding container 185f3e3e519bd3ba5a5684f2a5af6ffbe2c957e6a288affcf5220fb3bba67df2: Status 404 returned error can't find the container with id 185f3e3e519bd3ba5a5684f2a5af6ffbe2c957e6a288affcf5220fb3bba67df2 Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.770072 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.815919 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-686dd487d5-78rg9"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867074 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6cb478ff49-q7wrl"] Oct 06 12:26:09 crc kubenswrapper[4892]: E1006 12:26:09.867446 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" containerName="dnsmasq-dns" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867461 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" containerName="dnsmasq-dns" Oct 06 12:26:09 crc kubenswrapper[4892]: E1006 12:26:09.867474 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" containerName="init" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867481 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" containerName="init" Oct 06 12:26:09 crc kubenswrapper[4892]: E1006 12:26:09.867520 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9933b0-25cc-4543-8266-1ad5e1fd72ff" containerName="mariadb-account-create" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867527 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9933b0-25cc-4543-8266-1ad5e1fd72ff" containerName="mariadb-account-create" Oct 06 12:26:09 crc kubenswrapper[4892]: E1006 12:26:09.867537 4892 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6b639fd-a918-4284-973b-3dd64770ca40" containerName="mariadb-account-create" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867542 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b639fd-a918-4284-973b-3dd64770ca40" containerName="mariadb-account-create" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867714 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" containerName="dnsmasq-dns" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867739 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9933b0-25cc-4543-8266-1ad5e1fd72ff" containerName="mariadb-account-create" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.867751 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b639fd-a918-4284-973b-3dd64770ca40" containerName="mariadb-account-create" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.868690 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.891945 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.903559 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.949627 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7cdbff37-e2f5-4972-b932-6b53278fdaf9-horizon-secret-key\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.952128 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdbff37-e2f5-4972-b932-6b53278fdaf9-logs\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.952392 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-config-data\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.952443 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-scripts\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.952472 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh9mc\" (UniqueName: \"kubernetes.io/projected/7cdbff37-e2f5-4972-b932-6b53278fdaf9-kube-api-access-gh9mc\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.958934 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cb478ff49-q7wrl"] Oct 06 12:26:09 
crc kubenswrapper[4892]: I1006 12:26:09.965107 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66cf-account-create-mbggr" Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.993745 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:09 crc kubenswrapper[4892]: I1006 12:26:09.995474 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-c55qx"] Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.006750 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c68f45bbf-65dm9"] Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.055038 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9ffm\" (UniqueName: \"kubernetes.io/projected/e70aa8f3-809f-4f1d-b8dd-8ecee4996fec-kube-api-access-k9ffm\") pod \"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec\" (UID: \"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec\") " Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.055723 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdbff37-e2f5-4972-b932-6b53278fdaf9-logs\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.055753 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-config-data\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.055782 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-scripts\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.055797 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh9mc\" (UniqueName: \"kubernetes.io/projected/7cdbff37-e2f5-4972-b932-6b53278fdaf9-kube-api-access-gh9mc\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.055939 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7cdbff37-e2f5-4972-b932-6b53278fdaf9-horizon-secret-key\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.059986 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-config-data\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.065205 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-scripts\") pod \"horizon-6cb478ff49-q7wrl\" (UID: 
\"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.068757 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdbff37-e2f5-4972-b932-6b53278fdaf9-logs\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.070029 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7cdbff37-e2f5-4972-b932-6b53278fdaf9-horizon-secret-key\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.072607 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-686dd487d5-78rg9"] Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.073719 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70aa8f3-809f-4f1d-b8dd-8ecee4996fec-kube-api-access-k9ffm" (OuterVolumeSpecName: "kube-api-access-k9ffm") pod "e70aa8f3-809f-4f1d-b8dd-8ecee4996fec" (UID: "e70aa8f3-809f-4f1d-b8dd-8ecee4996fec"). InnerVolumeSpecName "kube-api-access-k9ffm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.079775 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh9mc\" (UniqueName: \"kubernetes.io/projected/7cdbff37-e2f5-4972-b932-6b53278fdaf9-kube-api-access-gh9mc\") pod \"horizon-6cb478ff49-q7wrl\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.120531 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jps9b"] Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.144295 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7976c9f5c7-4g42j"] Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.161304 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9ffm\" (UniqueName: \"kubernetes.io/projected/e70aa8f3-809f-4f1d-b8dd-8ecee4996fec-kube-api-access-k9ffm\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:10 crc kubenswrapper[4892]: W1006 12:26:10.194165 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2673cbbe_dc84_4a24_a48a_303029fcc02a.slice/crio-63013211bf2e408f56e0437aa51ca2426f81112e28c0331428ae4401bdac2488 WatchSource:0}: Error finding container 63013211bf2e408f56e0437aa51ca2426f81112e28c0331428ae4401bdac2488: Status 404 returned error can't find the container with id 63013211bf2e408f56e0437aa51ca2426f81112e28c0331428ae4401bdac2488 Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.209883 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac7c4a4-1a0b-45f6-9018-eb3b8acc184b" path="/var/lib/kubelet/pods/cac7c4a4-1a0b-45f6-9018-eb3b8acc184b/volumes" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.210858 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.212658 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:10 crc 
kubenswrapper[4892]: I1006 12:26:10.236997 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.289184 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.298956 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:10 crc kubenswrapper[4892]: W1006 12:26:10.343657 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2206d371_8e41_46c6_a255_2ba333463847.slice/crio-c189da6d8ac581f2958a2b7a968e3d8995916309cdb7efbd5c4a2fc8df7e7d63 WatchSource:0}: Error finding container c189da6d8ac581f2958a2b7a968e3d8995916309cdb7efbd5c4a2fc8df7e7d63: Status 404 returned error can't find the container with id c189da6d8ac581f2958a2b7a968e3d8995916309cdb7efbd5c4a2fc8df7e7d63 Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.405619 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jps9b" event={"ID":"2673cbbe-dc84-4a24-a48a-303029fcc02a","Type":"ContainerStarted","Data":"63013211bf2e408f56e0437aa51ca2426f81112e28c0331428ae4401bdac2488"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.407061 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerStarted","Data":"cc6f318f1acab0f2ec431777c072253e2a33aa2e5901305138462e824cee1460"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.408139 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976c9f5c7-4g42j" event={"ID":"70af3608-42f9-456b-9035-3030027e04ca","Type":"ContainerStarted","Data":"83ed8ef824a448373ed36d71066d4a81b25a7cf92f35a7239cc4b2514d126824"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.434225 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" event={"ID":"d6990a45-0f43-40a0-9c44-54e026d3acd2","Type":"ContainerStarted","Data":"c6f11bde89413407e738e9ddd15dba55911783aa6da7199f5e53ad030f18d2e6"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.460756 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c55qx" event={"ID":"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c","Type":"ContainerStarted","Data":"d0ab2a5cf351c0d4c0b27a3d9549784735a3569c606695fc7b519f4fddb1184d"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.468192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66cf-account-create-mbggr" event={"ID":"e70aa8f3-809f-4f1d-b8dd-8ecee4996fec","Type":"ContainerDied","Data":"78ff9528007aadb52d93de6e17e19fc923c423936b746d80db060c6cf7c97a3b"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.468233 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ff9528007aadb52d93de6e17e19fc923c423936b746d80db060c6cf7c97a3b" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.468354 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66cf-account-create-mbggr" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.474885 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686dd487d5-78rg9" event={"ID":"2b783419-dc0f-4bac-84fd-043c68de8718","Type":"ContainerStarted","Data":"525d33dcdf230d0b67fa6b91110498ba0861d1d71e4b28b1614169258a1cb64e"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.489058 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-c55qx" podStartSLOduration=3.489033091 podStartE2EDuration="3.489033091s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:10.475760662 +0000 UTC m=+1057.025466427" watchObservedRunningTime="2025-10-06 12:26:10.489033091 +0000 UTC m=+1057.038738856" Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.502697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12ba9b0a-689f-44c5-b66e-24b53fcad2ad","Type":"ContainerStarted","Data":"2d903d2e622520b94a9154882e414c6a560330607a4b689f518fdbdfb932023f"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.505958 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b7d1bf84-6c40-42b1-8b0f-632c397b0c69","Type":"ContainerStarted","Data":"3d0f2069e808783d48f53a2c122bc0447e1330ffd71d3baf95ec9bde910b7291"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.506004 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b7d1bf84-6c40-42b1-8b0f-632c397b0c69","Type":"ContainerStarted","Data":"a503a8ae4e74b6c04d9c2844bd69a8848ad5819edf9fbd6fb755f515ed5d9b5b"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.508120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerStarted","Data":"185f3e3e519bd3ba5a5684f2a5af6ffbe2c957e6a288affcf5220fb3bba67df2"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.510042 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2206d371-8e41-46c6-a255-2ba333463847","Type":"ContainerStarted","Data":"c189da6d8ac581f2958a2b7a968e3d8995916309cdb7efbd5c4a2fc8df7e7d63"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.511005 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6b13a195-f0db-44f0-a6f2-5eddc9da4c87","Type":"ContainerStarted","Data":"efa627317e924640ec252a2b519da77e3228e2f29cbccd33957a01ffba8b29c1"} Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.522625 4892 generic.go:334] "Generic (PLEG): container finished" podID="750105b4-a937-4973-94de-6ee9bee54c80" containerID="05efa52a56444f5a13b013e194bf59b7b6b978c73b010957132fc3c96f637ea2" exitCode=0 Oct 06 12:26:10 crc kubenswrapper[4892]: I1006 12:26:10.522703 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" event={"ID":"750105b4-a937-4973-94de-6ee9bee54c80","Type":"ContainerDied","Data":"05efa52a56444f5a13b013e194bf59b7b6b978c73b010957132fc3c96f637ea2"} Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.064578 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6cb478ff49-q7wrl"] Oct 06 12:26:11 crc 
kubenswrapper[4892]: I1006 12:26:11.138933 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.185228 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-nb\") pod \"750105b4-a937-4973-94de-6ee9bee54c80\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.185267 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-sb\") pod \"750105b4-a937-4973-94de-6ee9bee54c80\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.185329 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bd6w\" (UniqueName: \"kubernetes.io/projected/750105b4-a937-4973-94de-6ee9bee54c80-kube-api-access-8bd6w\") pod \"750105b4-a937-4973-94de-6ee9bee54c80\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.185457 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-config\") pod \"750105b4-a937-4973-94de-6ee9bee54c80\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.185496 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-svc\") pod \"750105b4-a937-4973-94de-6ee9bee54c80\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.185598 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-swift-storage-0\") pod \"750105b4-a937-4973-94de-6ee9bee54c80\" (UID: \"750105b4-a937-4973-94de-6ee9bee54c80\") " Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.203247 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750105b4-a937-4973-94de-6ee9bee54c80-kube-api-access-8bd6w" (OuterVolumeSpecName: "kube-api-access-8bd6w") pod "750105b4-a937-4973-94de-6ee9bee54c80" (UID: "750105b4-a937-4973-94de-6ee9bee54c80"). InnerVolumeSpecName "kube-api-access-8bd6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.237739 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "750105b4-a937-4973-94de-6ee9bee54c80" (UID: "750105b4-a937-4973-94de-6ee9bee54c80"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.251790 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-config" (OuterVolumeSpecName: "config") pod "750105b4-a937-4973-94de-6ee9bee54c80" (UID: "750105b4-a937-4973-94de-6ee9bee54c80"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.261245 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "750105b4-a937-4973-94de-6ee9bee54c80" (UID: "750105b4-a937-4973-94de-6ee9bee54c80"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.263232 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "750105b4-a937-4973-94de-6ee9bee54c80" (UID: "750105b4-a937-4973-94de-6ee9bee54c80"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.280875 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "750105b4-a937-4973-94de-6ee9bee54c80" (UID: "750105b4-a937-4973-94de-6ee9bee54c80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.288398 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.288419 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.288428 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.288436 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bd6w\" (UniqueName: \"kubernetes.io/projected/750105b4-a937-4973-94de-6ee9bee54c80-kube-api-access-8bd6w\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.288446 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.288455 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/750105b4-a937-4973-94de-6ee9bee54c80-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.438486 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8t56w"] Oct 06 12:26:11 crc kubenswrapper[4892]: E1006 12:26:11.438885 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70aa8f3-809f-4f1d-b8dd-8ecee4996fec" containerName="mariadb-account-create" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.438903 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70aa8f3-809f-4f1d-b8dd-8ecee4996fec" containerName="mariadb-account-create" Oct 06 
12:26:11 crc kubenswrapper[4892]: E1006 12:26:11.438951 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750105b4-a937-4973-94de-6ee9bee54c80" containerName="init" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.438958 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="750105b4-a937-4973-94de-6ee9bee54c80" containerName="init" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.439659 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70aa8f3-809f-4f1d-b8dd-8ecee4996fec" containerName="mariadb-account-create" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.439685 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="750105b4-a937-4973-94de-6ee9bee54c80" containerName="init" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.440278 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.442720 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2wz5s" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.442937 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.468957 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8t56w"] Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.493437 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-combined-ca-bundle\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.493482 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-db-sync-config-data\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.493556 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msbpc\" (UniqueName: \"kubernetes.io/projected/5de602d1-1bde-4049-88a7-d8132dee5d53-kube-api-access-msbpc\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.555522 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rk9bz"] Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.557029 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.558568 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ndb9h" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.559696 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.559976 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.576264 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rk9bz"] Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.589074 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b7d1bf84-6c40-42b1-8b0f-632c397b0c69","Type":"ContainerStarted","Data":"6f533bb06c6d8d36f3078610ea3c6b6832ad62dfc421ca429ab9750b1eea159b"} Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.589248 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api-log" containerID="cri-o://3d0f2069e808783d48f53a2c122bc0447e1330ffd71d3baf95ec9bde910b7291" gracePeriod=30 Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.589688 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api" containerID="cri-o://6f533bb06c6d8d36f3078610ea3c6b6832ad62dfc421ca429ab9750b1eea159b" gracePeriod=30 Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.592963 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.597101 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cb478ff49-q7wrl" event={"ID":"7cdbff37-e2f5-4972-b932-6b53278fdaf9","Type":"ContainerStarted","Data":"6325f6f109cb7e2b285db27d6cc06cb61a8b15a996fd7d7104bf3100d4256e6e"} Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598537 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msbpc\" (UniqueName: \"kubernetes.io/projected/5de602d1-1bde-4049-88a7-d8132dee5d53-kube-api-access-msbpc\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598573 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px9bw\" (UniqueName: \"kubernetes.io/projected/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-kube-api-access-px9bw\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598622 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-db-sync-config-data\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598640 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-combined-ca-bundle\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598666 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-config-data\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598703 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-scripts\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598868 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-etc-machine-id\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598973 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-combined-ca-bundle\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.598995 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-db-sync-config-data\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.599526 4892 generic.go:334] "Generic (PLEG): container finished" podID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerID="c9d275bdc47003ae34b837f14357b7041049010369b6b360a1e4daa397ed3f91" exitCode=0 Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.599601 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" event={"ID":"d6990a45-0f43-40a0-9c44-54e026d3acd2","Type":"ContainerDied","Data":"c9d275bdc47003ae34b837f14357b7041049010369b6b360a1e4daa397ed3f91"} Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.605138 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c55qx" event={"ID":"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c","Type":"ContainerStarted","Data":"574db5012a0285d7a89376db584b59015d118e027c004d85d5d7c924592b50e0"} Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.614722 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-db-sync-config-data\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.622495 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-msbpc\" (UniqueName: \"kubernetes.io/projected/5de602d1-1bde-4049-88a7-d8132dee5d53-kube-api-access-msbpc\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.624849 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-combined-ca-bundle\") pod \"barbican-db-sync-8t56w\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.639597 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" event={"ID":"750105b4-a937-4973-94de-6ee9bee54c80","Type":"ContainerDied","Data":"8c2d3965460dd15cf703397fdcd15b8f59c87c100715381781fb60a204513b25"} Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.639647 4892 scope.go:117] "RemoveContainer" containerID="05efa52a56444f5a13b013e194bf59b7b6b978c73b010957132fc3c96f637ea2" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.639781 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c67cb9c7-mczvc" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.655085 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.655062492 podStartE2EDuration="4.655062492s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:11.630703909 +0000 UTC m=+1058.180409674" watchObservedRunningTime="2025-10-06 12:26:11.655062492 +0000 UTC m=+1058.204768257" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.707035 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-db-sync-config-data\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.707083 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-combined-ca-bundle\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.707130 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-config-data\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.707238 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-scripts\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.707261 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-etc-machine-id\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.707591 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px9bw\" (UniqueName: \"kubernetes.io/projected/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-kube-api-access-px9bw\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.711020 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-knkh9"] Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.711964 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-config-data\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.712171 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-etc-machine-id\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.713300 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.715584 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-combined-ca-bundle\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.716197 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.718743 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-scripts\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.720878 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.721541 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jpd6d" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.736915 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-knkh9"] Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.748619 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-db-sync-config-data\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.749883 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" 
podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": read tcp 10.217.0.2:45818->10.217.0.154:9322: read: connection reset by peer" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.756492 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px9bw\" (UniqueName: \"kubernetes.io/projected/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-kube-api-access-px9bw\") pod \"cinder-db-sync-rk9bz\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.810071 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-combined-ca-bundle\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.810175 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-config\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.810289 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w26q6\" (UniqueName: \"kubernetes.io/projected/de86e5ee-d52e-4d8b-8077-a0d86175878c-kube-api-access-w26q6\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.815786 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.877994 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.911688 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w26q6\" (UniqueName: \"kubernetes.io/projected/de86e5ee-d52e-4d8b-8077-a0d86175878c-kube-api-access-w26q6\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.911760 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-combined-ca-bundle\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.911796 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-config\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.915719 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-config\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.916046 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-combined-ca-bundle\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.928469 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c67cb9c7-mczvc"] Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.934427 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w26q6\" (UniqueName: \"kubernetes.io/projected/de86e5ee-d52e-4d8b-8077-a0d86175878c-kube-api-access-w26q6\") pod \"neutron-db-sync-knkh9\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:11 crc kubenswrapper[4892]: I1006 12:26:11.954084 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c67cb9c7-mczvc"] Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.183965 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.196067 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750105b4-a937-4973-94de-6ee9bee54c80" path="/var/lib/kubelet/pods/750105b4-a937-4973-94de-6ee9bee54c80/volumes" Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.652760 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12ba9b0a-689f-44c5-b66e-24b53fcad2ad","Type":"ContainerStarted","Data":"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d"} Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.660066 4892 generic.go:334] "Generic (PLEG): container finished" podID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerID="6f533bb06c6d8d36f3078610ea3c6b6832ad62dfc421ca429ab9750b1eea159b" exitCode=0 Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.660431 4892 generic.go:334] "Generic (PLEG): container finished" podID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerID="3d0f2069e808783d48f53a2c122bc0447e1330ffd71d3baf95ec9bde910b7291" exitCode=143 Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.660475 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b7d1bf84-6c40-42b1-8b0f-632c397b0c69","Type":"ContainerDied","Data":"6f533bb06c6d8d36f3078610ea3c6b6832ad62dfc421ca429ab9750b1eea159b"} Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.660507 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b7d1bf84-6c40-42b1-8b0f-632c397b0c69","Type":"ContainerDied","Data":"3d0f2069e808783d48f53a2c122bc0447e1330ffd71d3baf95ec9bde910b7291"} Oct 06 12:26:12 crc kubenswrapper[4892]: I1006 12:26:12.663749 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2206d371-8e41-46c6-a255-2ba333463847","Type":"ContainerStarted","Data":"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7"} Oct 06 12:26:13 crc kubenswrapper[4892]: I1006 12:26:13.082152 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:26:13 crc kubenswrapper[4892]: I1006 12:26:13.082475 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": dial tcp 10.217.0.154:9322: connect: connection refused" Oct 06 12:26:15 crc kubenswrapper[4892]: I1006 12:26:15.700792 4892 generic.go:334] "Generic (PLEG): container finished" podID="bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" containerID="574db5012a0285d7a89376db584b59015d118e027c004d85d5d7c924592b50e0" exitCode=0 Oct 06 12:26:15 crc kubenswrapper[4892]: I1006 12:26:15.700902 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c55qx" event={"ID":"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c","Type":"ContainerDied","Data":"574db5012a0285d7a89376db584b59015d118e027c004d85d5d7c924592b50e0"} Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.399492 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7976c9f5c7-4g42j"] Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.439914 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7489d9984-82d5x"] Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.441642 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.450913 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.469315 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7489d9984-82d5x"] Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516210 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b229a3b8-5243-4ec5-8970-d69b61553a4b-logs\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516464 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wv5k\" (UniqueName: \"kubernetes.io/projected/b229a3b8-5243-4ec5-8970-d69b61553a4b-kube-api-access-5wv5k\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516583 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-config-data\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516668 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-secret-key\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516728 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-combined-ca-bundle\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516852 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-scripts\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516926 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-tls-certs\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.516920 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cb478ff49-q7wrl"] Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.542368 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c68c58656-gbbdd"] Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.546225 4892 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.553909 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c68c58656-gbbdd"] Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.618919 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-combined-ca-bundle\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.619226 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-tls-certs\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.620080 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-horizon-secret-key\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.620197 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f038239e-35e8-4409-a858-d7aad410f5fd-logs\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.620443 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww96v\" (UniqueName: \"kubernetes.io/projected/f038239e-35e8-4409-a858-d7aad410f5fd-kube-api-access-ww96v\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.620541 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b229a3b8-5243-4ec5-8970-d69b61553a4b-logs\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.620639 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wv5k\" (UniqueName: \"kubernetes.io/projected/b229a3b8-5243-4ec5-8970-d69b61553a4b-kube-api-access-5wv5k\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.620935 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b229a3b8-5243-4ec5-8970-d69b61553a4b-logs\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.621045 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f038239e-35e8-4409-a858-d7aad410f5fd-scripts\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.621157 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-config-data\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.622100 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-config-data\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.622310 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-horizon-tls-certs\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.622446 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-secret-key\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.622872 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-combined-ca-bundle\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.623041 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f038239e-35e8-4409-a858-d7aad410f5fd-config-data\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.623283 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-scripts\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.623805 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-scripts\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.625251 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-tls-certs\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") 
" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.626187 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-combined-ca-bundle\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.626486 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-secret-key\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.639271 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wv5k\" (UniqueName: \"kubernetes.io/projected/b229a3b8-5243-4ec5-8970-d69b61553a4b-kube-api-access-5wv5k\") pod \"horizon-7489d9984-82d5x\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.725683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f038239e-35e8-4409-a858-d7aad410f5fd-scripts\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.726095 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-horizon-tls-certs\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.726147 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f038239e-35e8-4409-a858-d7aad410f5fd-config-data\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.726227 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-combined-ca-bundle\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.726250 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-horizon-secret-key\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.726272 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f038239e-35e8-4409-a858-d7aad410f5fd-logs\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.726314 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ww96v\" (UniqueName: \"kubernetes.io/projected/f038239e-35e8-4409-a858-d7aad410f5fd-kube-api-access-ww96v\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.726582 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f038239e-35e8-4409-a858-d7aad410f5fd-scripts\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.731263 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f038239e-35e8-4409-a858-d7aad410f5fd-config-data\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.733859 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-combined-ca-bundle\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.733963 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-horizon-secret-key\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.735786 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f038239e-35e8-4409-a858-d7aad410f5fd-horizon-tls-certs\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.739535 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f038239e-35e8-4409-a858-d7aad410f5fd-logs\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.752912 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww96v\" (UniqueName: \"kubernetes.io/projected/f038239e-35e8-4409-a858-d7aad410f5fd-kube-api-access-ww96v\") pod \"horizon-6c68c58656-gbbdd\" (UID: \"f038239e-35e8-4409-a858-d7aad410f5fd\") " pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.773738 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:16 crc kubenswrapper[4892]: I1006 12:26:16.886936 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:17 crc kubenswrapper[4892]: I1006 12:26:17.143231 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8t56w"] Oct 06 12:26:22 crc kubenswrapper[4892]: W1006 12:26:22.101847 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de602d1_1bde_4049_88a7_d8132dee5d53.slice/crio-22262330c23f861a3be1686e2c4b09d75d34d5cbb486f5595bccb97a078af36c WatchSource:0}: Error finding container 22262330c23f861a3be1686e2c4b09d75d34d5cbb486f5595bccb97a078af36c: Status 404 returned error can't find the container with id 22262330c23f861a3be1686e2c4b09d75d34d5cbb486f5595bccb97a078af36c Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.370159 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.383815 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455166 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-credential-keys\") pod \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455221 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-combined-ca-bundle\") pod \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455247 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-custom-prometheus-ca\") pod \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455270 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-fernet-keys\") pod \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455319 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4cpb\" (UniqueName: \"kubernetes.io/projected/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-kube-api-access-t4cpb\") pod \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455371 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-scripts\") pod \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455398 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-config-data\") pod \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\" (UID: 
\"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455511 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdpmw\" (UniqueName: \"kubernetes.io/projected/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-kube-api-access-vdpmw\") pod \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.455532 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-combined-ca-bundle\") pod \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\" (UID: \"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.456985 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-logs\") pod \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.457069 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-config-data\") pod \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\" (UID: \"b7d1bf84-6c40-42b1-8b0f-632c397b0c69\") " Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.457469 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-logs" (OuterVolumeSpecName: "logs") pod "b7d1bf84-6c40-42b1-8b0f-632c397b0c69" (UID: "b7d1bf84-6c40-42b1-8b0f-632c397b0c69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.457913 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.461630 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-scripts" (OuterVolumeSpecName: "scripts") pod "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" (UID: "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.461704 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" (UID: "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.463875 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" (UID: "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.471957 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-kube-api-access-vdpmw" (OuterVolumeSpecName: "kube-api-access-vdpmw") pod "b7d1bf84-6c40-42b1-8b0f-632c397b0c69" (UID: "b7d1bf84-6c40-42b1-8b0f-632c397b0c69"). InnerVolumeSpecName "kube-api-access-vdpmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.472016 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-kube-api-access-t4cpb" (OuterVolumeSpecName: "kube-api-access-t4cpb") pod "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" (UID: "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c"). InnerVolumeSpecName "kube-api-access-t4cpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.498522 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b7d1bf84-6c40-42b1-8b0f-632c397b0c69" (UID: "b7d1bf84-6c40-42b1-8b0f-632c397b0c69"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.498751 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7d1bf84-6c40-42b1-8b0f-632c397b0c69" (UID: "b7d1bf84-6c40-42b1-8b0f-632c397b0c69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.503352 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-config-data" (OuterVolumeSpecName: "config-data") pod "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" (UID: "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.534808 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" (UID: "bec41a0b-5527-4eb0-b2a4-f0ec0751e95c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.545453 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-config-data" (OuterVolumeSpecName: "config-data") pod "b7d1bf84-6c40-42b1-8b0f-632c397b0c69" (UID: "b7d1bf84-6c40-42b1-8b0f-632c397b0c69"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559343 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559385 4892 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559398 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559410 4892 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559422 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559433 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4cpb\" (UniqueName: \"kubernetes.io/projected/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-kube-api-access-t4cpb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559444 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559453 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559463 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.559473 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdpmw\" (UniqueName: \"kubernetes.io/projected/b7d1bf84-6c40-42b1-8b0f-632c397b0c69-kube-api-access-vdpmw\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.719153 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-knkh9"] Oct 06 12:26:22 crc kubenswrapper[4892]: W1006 12:26:22.759780 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde86e5ee_d52e_4d8b_8077_a0d86175878c.slice/crio-3965ee4999104e6e3254165ef652d19371613d91f4066a51e7ece5c231b24633 WatchSource:0}: Error finding container 3965ee4999104e6e3254165ef652d19371613d91f4066a51e7ece5c231b24633: Status 404 returned error can't find the container with id 3965ee4999104e6e3254165ef652d19371613d91f4066a51e7ece5c231b24633 Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.803724 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-api-0" event={"ID":"b7d1bf84-6c40-42b1-8b0f-632c397b0c69","Type":"ContainerDied","Data":"a503a8ae4e74b6c04d9c2844bd69a8848ad5819edf9fbd6fb755f515ed5d9b5b"} Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.803774 4892 scope.go:117] "RemoveContainer" containerID="6f533bb06c6d8d36f3078610ea3c6b6832ad62dfc421ca429ab9750b1eea159b" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.803863 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.821317 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8t56w" event={"ID":"5de602d1-1bde-4049-88a7-d8132dee5d53","Type":"ContainerStarted","Data":"22262330c23f861a3be1686e2c4b09d75d34d5cbb486f5595bccb97a078af36c"} Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.831403 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-knkh9" event={"ID":"de86e5ee-d52e-4d8b-8077-a0d86175878c","Type":"ContainerStarted","Data":"3965ee4999104e6e3254165ef652d19371613d91f4066a51e7ece5c231b24633"} Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.839163 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" event={"ID":"d6990a45-0f43-40a0-9c44-54e026d3acd2","Type":"ContainerStarted","Data":"aa831f1d1cc18eaad85f24e4ffb6f6cda2fba4aad778ae21f357a85050e812fd"} Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.839452 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.852362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-c55qx" event={"ID":"bec41a0b-5527-4eb0-b2a4-f0ec0751e95c","Type":"ContainerDied","Data":"d0ab2a5cf351c0d4c0b27a3d9549784735a3569c606695fc7b519f4fddb1184d"} Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.852401 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ab2a5cf351c0d4c0b27a3d9549784735a3569c606695fc7b519f4fddb1184d" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.852468 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-c55qx" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.874269 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.877292 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jps9b" event={"ID":"2673cbbe-dc84-4a24-a48a-303029fcc02a","Type":"ContainerStarted","Data":"f2d055771567f8d238078fe05a09537ba536dbaf1185c3106edc7215a2538356"} Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.884637 4892 scope.go:117] "RemoveContainer" containerID="3d0f2069e808783d48f53a2c122bc0447e1330ffd71d3baf95ec9bde910b7291" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.903082 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.917362 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:22 crc kubenswrapper[4892]: E1006 12:26:22.917795 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api-log" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.917814 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api-log" Oct 06 12:26:22 crc kubenswrapper[4892]: E1006 12:26:22.917833 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.917840 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api" Oct 06 12:26:22 crc kubenswrapper[4892]: E1006 12:26:22.917854 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" containerName="keystone-bootstrap" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.917860 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" containerName="keystone-bootstrap" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.918030 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api-log" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.918059 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" containerName="keystone-bootstrap" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.918071 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.919088 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.923748 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.930988 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.937020 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" podStartSLOduration=15.936983833 podStartE2EDuration="15.936983833s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:22.86940469 +0000 UTC m=+1069.419110465" watchObservedRunningTime="2025-10-06 12:26:22.936983833 +0000 UTC m=+1069.486689598" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.949750 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rk9bz"] Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.951083 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jps9b" podStartSLOduration=3.912639535 podStartE2EDuration="15.951066907s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="2025-10-06 12:26:10.223487452 +0000 UTC m=+1056.773193217" lastFinishedPulling="2025-10-06 12:26:22.261914824 +0000 UTC m=+1068.811620589" observedRunningTime="2025-10-06 12:26:22.899039281 +0000 UTC m=+1069.448745046" watchObservedRunningTime="2025-10-06 12:26:22.951066907 +0000 UTC m=+1069.500772672" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.973637 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvz8\" (UniqueName: \"kubernetes.io/projected/a61c1481-8c3c-490a-97eb-a03156bb7ee5-kube-api-access-wkvz8\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.973707 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.973750 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a61c1481-8c3c-490a-97eb-a03156bb7ee5-logs\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.973790 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.973850 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-config-data\") pod \"watcher-api-0\" (UID: 
\"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.984142 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.984182 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.984216 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.984857 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f99cab5f831d4479bae318ede8be6239cf73affb4f0ae80b3e22b31bc2f59223"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.984906 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://f99cab5f831d4479bae318ede8be6239cf73affb4f0ae80b3e22b31bc2f59223" gracePeriod=600 Oct 06 12:26:22 crc kubenswrapper[4892]: I1006 12:26:22.988276 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7489d9984-82d5x"] Oct 06 12:26:23 crc kubenswrapper[4892]: W1006 12:26:23.007925 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb229a3b8_5243_4ec5_8970_d69b61553a4b.slice/crio-0744491d33296a0800e151123d41ec10745a8abeabfa6ccf6920e4684577d078 WatchSource:0}: Error finding container 0744491d33296a0800e151123d41ec10745a8abeabfa6ccf6920e4684577d078: Status 404 returned error can't find the container with id 0744491d33296a0800e151123d41ec10745a8abeabfa6ccf6920e4684577d078 Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.077039 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-config-data\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.077385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvz8\" (UniqueName: \"kubernetes.io/projected/a61c1481-8c3c-490a-97eb-a03156bb7ee5-kube-api-access-wkvz8\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.077452 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-custom-prometheus-ca\") pod 
\"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.077484 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a61c1481-8c3c-490a-97eb-a03156bb7ee5-logs\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.077519 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.085726 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.154:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.087422 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a61c1481-8c3c-490a-97eb-a03156bb7ee5-logs\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.131280 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c68c58656-gbbdd"] Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.134258 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvz8\" (UniqueName: \"kubernetes.io/projected/a61c1481-8c3c-490a-97eb-a03156bb7ee5-kube-api-access-wkvz8\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.134542 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.134570 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-config-data\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.135092 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.238053 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.476413 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-c55qx"] Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.489906 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-c55qx"] Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.584779 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rqpxp"] Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.586563 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.598205 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.598505 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.598624 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qj9t" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.598721 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.613770 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rqpxp"] Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.700986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-combined-ca-bundle\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.701079 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-credential-keys\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.701372 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-scripts\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.701432 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p9jt\" (UniqueName: \"kubernetes.io/projected/657b347a-9a82-404a-b263-f51befcd5837-kube-api-access-4p9jt\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.701460 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-fernet-keys\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc 
kubenswrapper[4892]: I1006 12:26:23.701483 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-config-data\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.803362 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p9jt\" (UniqueName: \"kubernetes.io/projected/657b347a-9a82-404a-b263-f51befcd5837-kube-api-access-4p9jt\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.803404 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-fernet-keys\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.803428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-config-data\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.803505 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-combined-ca-bundle\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.803534 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-credential-keys\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.803604 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-scripts\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.823457 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-credential-keys\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.824217 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-fernet-keys\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.824279 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-config-data\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.829685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-scripts\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.832903 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p9jt\" (UniqueName: \"kubernetes.io/projected/657b347a-9a82-404a-b263-f51befcd5837-kube-api-access-4p9jt\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.837101 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-combined-ca-bundle\") pod \"keystone-bootstrap-rqpxp\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.912840 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-knkh9" event={"ID":"de86e5ee-d52e-4d8b-8077-a0d86175878c","Type":"ContainerStarted","Data":"c8649c4a844019867546376f544e67b95ac672156102e7d197d5fe6c9dfd3682"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.924809 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7489d9984-82d5x" event={"ID":"b229a3b8-5243-4ec5-8970-d69b61553a4b","Type":"ContainerStarted","Data":"b929f09ed20951199fcf75f8cafe3036227c5cbc150094f8ea1bac9f3e6ae072"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.924848 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7489d9984-82d5x" event={"ID":"b229a3b8-5243-4ec5-8970-d69b61553a4b","Type":"ContainerStarted","Data":"0744491d33296a0800e151123d41ec10745a8abeabfa6ccf6920e4684577d078"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.930948 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.937825 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6b13a195-f0db-44f0-a6f2-5eddc9da4c87","Type":"ContainerStarted","Data":"a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.940580 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.941364 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerStarted","Data":"e6539d1e7544ee29dc9c8e9e92bb8c80e741283532258168d0521522155d4d52"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.942545 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-knkh9" podStartSLOduration=12.942533026 podStartE2EDuration="12.942533026s" podCreationTimestamp="2025-10-06 12:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:23.93303038 +0000 UTC m=+1070.482736145" watchObservedRunningTime="2025-10-06 12:26:23.942533026 +0000 UTC m=+1070.492238791" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.945631 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="f99cab5f831d4479bae318ede8be6239cf73affb4f0ae80b3e22b31bc2f59223" exitCode=0 Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.945657 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"f99cab5f831d4479bae318ede8be6239cf73affb4f0ae80b3e22b31bc2f59223"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.945709 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"860dd81af7b9279e259a2bd7600f304a9fac68884adcaaf5b381f360c68fdea5"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.945729 4892 scope.go:117] "RemoveContainer" containerID="65a23fa133935013fcfc189b72a8929bb8d601f4fefb20b891097d8ee152e268" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.951648 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=4.797712233 podStartE2EDuration="16.951621079s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="2025-10-06 12:26:10.000860584 +0000 UTC m=+1056.550566349" lastFinishedPulling="2025-10-06 12:26:22.15476942 +0000 UTC m=+1068.704475195" observedRunningTime="2025-10-06 12:26:23.950566137 +0000 UTC m=+1070.500271902" watchObservedRunningTime="2025-10-06 12:26:23.951621079 +0000 UTC m=+1070.501326844" Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.956739 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686dd487d5-78rg9" event={"ID":"2b783419-dc0f-4bac-84fd-043c68de8718","Type":"ContainerStarted","Data":"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.956791 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686dd487d5-78rg9" event={"ID":"2b783419-dc0f-4bac-84fd-043c68de8718","Type":"ContainerStarted","Data":"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.959092 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"12ba9b0a-689f-44c5-b66e-24b53fcad2ad","Type":"ContainerStarted","Data":"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.959265 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-log" containerID="cri-o://75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d" gracePeriod=30 Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.959581 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-httpd" containerID="cri-o://b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854" gracePeriod=30 Oct 06 12:26:23 crc kubenswrapper[4892]: W1006 12:26:23.968357 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda61c1481_8c3c_490a_97eb_a03156bb7ee5.slice/crio-b03ba473369ee8bb2ae82525d3626fc39e4432b1858b6db4708894f23dff7162 WatchSource:0}: Error finding container b03ba473369ee8bb2ae82525d3626fc39e4432b1858b6db4708894f23dff7162: Status 404 returned error can't find the container with id b03ba473369ee8bb2ae82525d3626fc39e4432b1858b6db4708894f23dff7162 Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.968649 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-log" containerID="cri-o://13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7" gracePeriod=30 Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.969111 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2206d371-8e41-46c6-a255-2ba333463847","Type":"ContainerStarted","Data":"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.969310 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-httpd" containerID="cri-o://15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88" gracePeriod=30 Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.974941 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rk9bz" event={"ID":"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea","Type":"ContainerStarted","Data":"05adee9262e8f58651ca92a0eb84ae20cd9b86888ab99cc52d05448d283d65ab"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.987061 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976c9f5c7-4g42j" event={"ID":"70af3608-42f9-456b-9035-3030027e04ca","Type":"ContainerStarted","Data":"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.993549 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerStarted","Data":"d1e2d76942f9a3ceb5fa669e3e7d9d47b75b8b88c3c082dcdb6f34306b4cb650"} Oct 06 12:26:23 crc kubenswrapper[4892]: I1006 12:26:23.999820 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c68c58656-gbbdd" 
event={"ID":"f038239e-35e8-4409-a858-d7aad410f5fd","Type":"ContainerStarted","Data":"01323db23a1bf9b599c1a3d3e9d3bc6155ed9a08c9d1dfaef73a46d2bfaa75fb"} Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.007560 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.007543321 podStartE2EDuration="17.007543321s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:23.999010495 +0000 UTC m=+1070.548716260" watchObservedRunningTime="2025-10-06 12:26:24.007543321 +0000 UTC m=+1070.557249086" Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.024011 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cb478ff49-q7wrl" event={"ID":"7cdbff37-e2f5-4972-b932-6b53278fdaf9","Type":"ContainerStarted","Data":"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32"} Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.024054 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cb478ff49-q7wrl" event={"ID":"7cdbff37-e2f5-4972-b932-6b53278fdaf9","Type":"ContainerStarted","Data":"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b"} Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.024630 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6cb478ff49-q7wrl" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon-log" containerID="cri-o://1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b" gracePeriod=30 Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.024914 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6cb478ff49-q7wrl" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon" containerID="cri-o://48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32" gracePeriod=30 Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.029227 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.029208133 podStartE2EDuration="16.029208133s" podCreationTimestamp="2025-10-06 12:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:24.023847002 +0000 UTC m=+1070.573552767" watchObservedRunningTime="2025-10-06 12:26:24.029208133 +0000 UTC m=+1070.578913898" Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.041852 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=4.422884105 podStartE2EDuration="17.041833113s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="2025-10-06 12:26:09.557181765 +0000 UTC m=+1056.106887530" lastFinishedPulling="2025-10-06 12:26:22.176130773 +0000 UTC m=+1068.725836538" observedRunningTime="2025-10-06 12:26:24.040128002 +0000 UTC m=+1070.589833757" watchObservedRunningTime="2025-10-06 12:26:24.041833113 +0000 UTC m=+1070.591538868" Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.068827 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6cb478ff49-q7wrl" podStartSLOduration=3.753637803 podStartE2EDuration="15.068802515s" podCreationTimestamp="2025-10-06 12:26:09 +0000 UTC" 
firstStartedPulling="2025-10-06 12:26:11.09461815 +0000 UTC m=+1057.644323915" lastFinishedPulling="2025-10-06 12:26:22.409782862 +0000 UTC m=+1068.959488627" observedRunningTime="2025-10-06 12:26:24.05469548 +0000 UTC m=+1070.604401245" watchObservedRunningTime="2025-10-06 12:26:24.068802515 +0000 UTC m=+1070.618508290" Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.200069 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d1bf84-6c40-42b1-8b0f-632c397b0c69" path="/var/lib/kubelet/pods/b7d1bf84-6c40-42b1-8b0f-632c397b0c69/volumes" Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.200673 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bec41a0b-5527-4eb0-b2a4-f0ec0751e95c" path="/var/lib/kubelet/pods/bec41a0b-5527-4eb0-b2a4-f0ec0751e95c/volumes" Oct 06 12:26:24 crc kubenswrapper[4892]: I1006 12:26:24.995571 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rqpxp"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.031632 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.037631 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.087890 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c68c58656-gbbdd" event={"ID":"f038239e-35e8-4409-a858-d7aad410f5fd","Type":"ContainerStarted","Data":"370de9cc2aeef48ac2313f8264c42d0968e1428dc1d8fb38daa5f76508ca0846"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.087936 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c68c58656-gbbdd" event={"ID":"f038239e-35e8-4409-a858-d7aad410f5fd","Type":"ContainerStarted","Data":"2808bdb2a55d99f14a0f50d06ff73f151a34f37571d59ccf53a6997c89d52120"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.095380 4892 generic.go:334] "Generic (PLEG): container finished" podID="2206d371-8e41-46c6-a255-2ba333463847" containerID="15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88" exitCode=0 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.095415 4892 generic.go:334] "Generic (PLEG): container finished" podID="2206d371-8e41-46c6-a255-2ba333463847" containerID="13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7" exitCode=143 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.095465 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2206d371-8e41-46c6-a255-2ba333463847","Type":"ContainerDied","Data":"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.095496 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2206d371-8e41-46c6-a255-2ba333463847","Type":"ContainerDied","Data":"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.095519 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2206d371-8e41-46c6-a255-2ba333463847","Type":"ContainerDied","Data":"c189da6d8ac581f2958a2b7a968e3d8995916309cdb7efbd5c4a2fc8df7e7d63"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.095538 4892 scope.go:117] "RemoveContainer" 
containerID="15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.095821 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.113469 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7489d9984-82d5x" event={"ID":"b229a3b8-5243-4ec5-8970-d69b61553a4b","Type":"ContainerStarted","Data":"18eac723022a92e25d1293baa3888a63d89e6f7b88835b26ef947985f501f48b"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.138140 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c68c58656-gbbdd" podStartSLOduration=9.138117036 podStartE2EDuration="9.138117036s" podCreationTimestamp="2025-10-06 12:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:25.136732884 +0000 UTC m=+1071.686438649" watchObservedRunningTime="2025-10-06 12:26:25.138117036 +0000 UTC m=+1071.687822811" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.168163 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7489d9984-82d5x" podStartSLOduration=9.168142979 podStartE2EDuration="9.168142979s" podCreationTimestamp="2025-10-06 12:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:25.153702114 +0000 UTC m=+1071.703407879" watchObservedRunningTime="2025-10-06 12:26:25.168142979 +0000 UTC m=+1071.717848744" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173078 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7hbh\" (UniqueName: \"kubernetes.io/projected/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-kube-api-access-t7hbh\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173129 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-scripts\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173178 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-combined-ca-bundle\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173211 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-combined-ca-bundle\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173282 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-httpd-run\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173345 4892 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-public-tls-certs\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173414 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-logs\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173443 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-scripts\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173462 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173484 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-logs\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173505 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-httpd-run\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173527 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173550 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-config-data\") pod \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\" (UID: \"12ba9b0a-689f-44c5-b66e-24b53fcad2ad\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173588 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-config-data\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173617 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-internal-tls-certs\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.173649 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrvgm\" (UniqueName: 
\"kubernetes.io/projected/2206d371-8e41-46c6-a255-2ba333463847-kube-api-access-nrvgm\") pod \"2206d371-8e41-46c6-a255-2ba333463847\" (UID: \"2206d371-8e41-46c6-a255-2ba333463847\") " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.180124 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.181712 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-kube-api-access-t7hbh" (OuterVolumeSpecName: "kube-api-access-t7hbh") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "kube-api-access-t7hbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.183664 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a61c1481-8c3c-490a-97eb-a03156bb7ee5","Type":"ContainerStarted","Data":"4cc36948b06fe9f745c389827ff6fcac12829016f8122d8ccb102e44c82f19e1"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.183698 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a61c1481-8c3c-490a-97eb-a03156bb7ee5","Type":"ContainerStarted","Data":"b03ba473369ee8bb2ae82525d3626fc39e4432b1858b6db4708894f23dff7162"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.184847 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.184872 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-logs" (OuterVolumeSpecName: "logs") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.185261 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.188515 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-scripts" (OuterVolumeSpecName: "scripts") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.189509 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": dial tcp 10.217.0.169:9322: connect: connection refused" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.190906 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-logs" (OuterVolumeSpecName: "logs") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.193088 4892 generic.go:334] "Generic (PLEG): container finished" podID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerID="b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854" exitCode=0 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.193115 4892 generic.go:334] "Generic (PLEG): container finished" podID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerID="75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d" exitCode=143 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.193169 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12ba9b0a-689f-44c5-b66e-24b53fcad2ad","Type":"ContainerDied","Data":"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.193195 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12ba9b0a-689f-44c5-b66e-24b53fcad2ad","Type":"ContainerDied","Data":"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.193204 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"12ba9b0a-689f-44c5-b66e-24b53fcad2ad","Type":"ContainerDied","Data":"2d903d2e622520b94a9154882e414c6a560330607a4b689f518fdbdfb932023f"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.193303 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.195484 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-scripts" (OuterVolumeSpecName: "scripts") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.205608 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.212557 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2206d371-8e41-46c6-a255-2ba333463847-kube-api-access-nrvgm" (OuterVolumeSpecName: "kube-api-access-nrvgm") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "kube-api-access-nrvgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.218474 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.225655 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976c9f5c7-4g42j" event={"ID":"70af3608-42f9-456b-9035-3030027e04ca","Type":"ContainerStarted","Data":"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d"} Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.225780 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7976c9f5c7-4g42j" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon-log" containerID="cri-o://e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce" gracePeriod=30 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.226257 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7976c9f5c7-4g42j" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon" containerID="cri-o://19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d" gracePeriod=30 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.226396 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-686dd487d5-78rg9" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon-log" containerID="cri-o://f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6" gracePeriod=30 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.227200 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-686dd487d5-78rg9" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon" containerID="cri-o://8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870" gracePeriod=30 Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.246463 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.252600 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.252583919 podStartE2EDuration="3.252583919s" podCreationTimestamp="2025-10-06 12:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:25.208651098 +0000 UTC m=+1071.758356863" watchObservedRunningTime="2025-10-06 12:26:25.252583919 +0000 UTC m=+1071.802289684" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282579 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282609 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282628 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282637 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282646 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2206d371-8e41-46c6-a255-2ba333463847-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282660 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282669 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrvgm\" (UniqueName: \"kubernetes.io/projected/2206d371-8e41-46c6-a255-2ba333463847-kube-api-access-nrvgm\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282677 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7hbh\" (UniqueName: \"kubernetes.io/projected/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-kube-api-access-t7hbh\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282685 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282695 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.282703 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.298495 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-686dd487d5-78rg9" podStartSLOduration=5.952428414 podStartE2EDuration="18.29847512s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="2025-10-06 12:26:10.060313053 +0000 UTC m=+1056.610018818" lastFinishedPulling="2025-10-06 12:26:22.406359759 +0000 UTC m=+1068.956065524" observedRunningTime="2025-10-06 12:26:25.277753847 +0000 UTC m=+1071.827459632" watchObservedRunningTime="2025-10-06 12:26:25.29847512 +0000 UTC m=+1071.848180885" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.300261 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7976c9f5c7-4g42j" podStartSLOduration=6.111176961 podStartE2EDuration="18.300253524s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="2025-10-06 12:26:10.222954676 +0000 UTC m=+1056.772660441" lastFinishedPulling="2025-10-06 12:26:22.412031239 +0000 UTC m=+1068.961737004" observedRunningTime="2025-10-06 12:26:25.247899988 +0000 UTC m=+1071.797605753" watchObservedRunningTime="2025-10-06 12:26:25.300253524 +0000 UTC m=+1071.849959289" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.323690 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.374444 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.374476 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.381119 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.384933 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.391367 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.391464 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.391544 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.391753 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-config-data" (OuterVolumeSpecName: "config-data") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.401489 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "12ba9b0a-689f-44c5-b66e-24b53fcad2ad" (UID: "12ba9b0a-689f-44c5-b66e-24b53fcad2ad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.424750 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-config-data" (OuterVolumeSpecName: "config-data") pod "2206d371-8e41-46c6-a255-2ba333463847" (UID: "2206d371-8e41-46c6-a255-2ba333463847"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.493640 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.493686 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba9b0a-689f-44c5-b66e-24b53fcad2ad-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.493695 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2206d371-8e41-46c6-a255-2ba333463847-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.571121 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.584168 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.604228 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: E1006 12:26:25.606253 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-log" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606303 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-log" Oct 06 12:26:25 crc kubenswrapper[4892]: E1006 12:26:25.606333 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-log" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606339 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-log" Oct 06 12:26:25 crc kubenswrapper[4892]: E1006 12:26:25.606358 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-httpd" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606363 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-httpd" Oct 06 12:26:25 crc kubenswrapper[4892]: E1006 12:26:25.606377 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-httpd" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606383 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-httpd" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606571 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-httpd" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606585 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" containerName="glance-log" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606594 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-httpd" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.606615 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2206d371-8e41-46c6-a255-2ba333463847" containerName="glance-log" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.607767 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.611604 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.612507 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.617180 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697496 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697557 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697589 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697742 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697792 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-logs\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697834 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697874 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.697959 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lkd8\" (UniqueName: \"kubernetes.io/projected/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-kube-api-access-6lkd8\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.746910 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.763963 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.775767 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.777836 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.781715 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.781846 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.792833 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802093 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lkd8\" (UniqueName: \"kubernetes.io/projected/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-kube-api-access-6lkd8\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802175 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802221 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802252 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802299 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802336 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-logs\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802361 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.802386 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.803746 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.805847 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-logs\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.806764 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.808824 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.809973 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.812727 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0" Oct 06 12:26:25 
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.825882 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lkd8\" (UniqueName: \"kubernetes.io/projected/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-kube-api-access-6lkd8\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.835113 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-scripts\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.848543 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " pod="openstack/glance-default-external-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903557 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903625 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903684 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903701 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903724 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903748 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9mw\" (UniqueName: \"kubernetes.io/projected/ff0622c4-d0fd-4732-9418-58d60d081887-kube-api-access-vq9mw\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903764 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.903794 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:25 crc kubenswrapper[4892]: I1006 12:26:25.930426 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005399 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005472 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005533 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005552 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005581 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005613 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9mw\" (UniqueName: \"kubernetes.io/projected/ff0622c4-d0fd-4732-9418-58d60d081887-kube-api-access-vq9mw\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005638 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.005677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.006197 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.006584 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.006815 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.016016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.035892 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.035902 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.039867 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9mw\" (UniqueName: \"kubernetes.io/projected/ff0622c4-d0fd-4732-9418-58d60d081887-kube-api-access-vq9mw\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.049305 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.097493 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.183394 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ba9b0a-689f-44c5-b66e-24b53fcad2ad" path="/var/lib/kubelet/pods/12ba9b0a-689f-44c5-b66e-24b53fcad2ad/volumes" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.185259 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2206d371-8e41-46c6-a255-2ba333463847" path="/var/lib/kubelet/pods/2206d371-8e41-46c6-a255-2ba333463847/volumes" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.248367 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a61c1481-8c3c-490a-97eb-a03156bb7ee5","Type":"ContainerStarted","Data":"c75ea41f40beccff9afaf15c1b9ce11f7c8a4e08308724cd8eb9bbb791b3367b"} Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.401742 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.513469 4892 scope.go:117] "RemoveContainer" containerID="13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.751037 4892 scope.go:117] "RemoveContainer" containerID="15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88" Oct 06 12:26:26 crc kubenswrapper[4892]: E1006 12:26:26.751845 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88\": container with ID starting with 15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88 not found: ID does not exist" containerID="15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.751895 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88"} err="failed to get container status \"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88\": rpc error: code = NotFound desc = could not find container \"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88\": container with ID starting with 15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88 not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.751934 4892 scope.go:117] "RemoveContainer" containerID="13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7" Oct 06 12:26:26 crc kubenswrapper[4892]: E1006 12:26:26.752279 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7\": container with ID starting with 13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7 not found: ID does not exist" containerID="13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.752307 4892 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7"} err="failed to get container status \"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7\": rpc error: code = NotFound desc = could not find container \"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7\": container with ID starting with 13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7 not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.752372 4892 scope.go:117] "RemoveContainer" containerID="15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.752640 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88"} err="failed to get container status \"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88\": rpc error: code = NotFound desc = could not find container \"15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88\": container with ID starting with 15c7f1dafe41ea9b8f8231bb3742bb8e84279d88ddab34aa8193968aa876da88 not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.752665 4892 scope.go:117] "RemoveContainer" containerID="13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.752854 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7"} err="failed to get container status \"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7\": rpc error: code = NotFound desc = could not find container \"13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7\": container with ID starting with 13b6d402088a513b5fed8bc1a9d52033b84a5685e31950a30e2b4201a102d0b7 not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.752872 4892 scope.go:117] "RemoveContainer" containerID="b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.774909 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.774988 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.824775 4892 scope.go:117] "RemoveContainer" containerID="75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.869165 4892 scope.go:117] "RemoveContainer" containerID="b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854" Oct 06 12:26:26 crc kubenswrapper[4892]: E1006 12:26:26.872164 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854\": container with ID starting with b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854 not found: ID does not exist" containerID="b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.872202 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854"} err="failed to get container status \"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854\": rpc error: code = NotFound desc = could not find container \"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854\": container with ID starting with b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854 not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.872228 4892 scope.go:117] "RemoveContainer" containerID="75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d" Oct 06 12:26:26 crc kubenswrapper[4892]: E1006 12:26:26.872547 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d\": container with ID starting with 75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d not found: ID does not exist" containerID="75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.872564 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d"} err="failed to get container status \"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d\": rpc error: code = NotFound desc = could not find container \"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d\": container with ID starting with 75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.872577 4892 scope.go:117] "RemoveContainer" containerID="b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.872934 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854"} err="failed to get container status \"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854\": rpc error: code = NotFound desc = could not find container \"b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854\": container with ID starting with b5bb663899075e6617525c0c3509eb7b9bed0ceb8c8a0b2239af4125c9ac7854 not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.872948 4892 scope.go:117] "RemoveContainer" containerID="75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.873935 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d"} err="failed to get container status \"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d\": rpc error: code = NotFound desc = could not find container \"75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d\": container with ID starting with 75df37a9e22ced23bfbb0256a3b128b27431d1750ea2221601d2c1b466310d0d not found: ID does not exist" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.887995 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:26 crc kubenswrapper[4892]: I1006 12:26:26.888038 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.239674 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.257654 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rqpxp" event={"ID":"657b347a-9a82-404a-b263-f51befcd5837","Type":"ContainerStarted","Data":"3c2a02d34a6fd831789c5c2c29b8643e0b8a93f9f084d92901abee5e94d3a7d3"} Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.257870 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rqpxp" event={"ID":"657b347a-9a82-404a-b263-f51befcd5837","Type":"ContainerStarted","Data":"66353aa284a28751ffd560e8b5b8011d58b90f09f45d070f576a894d8881c62c"} Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.260264 4892 generic.go:334] "Generic (PLEG): container finished" podID="2673cbbe-dc84-4a24-a48a-303029fcc02a" containerID="f2d055771567f8d238078fe05a09537ba536dbaf1185c3106edc7215a2538356" exitCode=0 Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.260305 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jps9b" event={"ID":"2673cbbe-dc84-4a24-a48a-303029fcc02a","Type":"ContainerDied","Data":"f2d055771567f8d238078fe05a09537ba536dbaf1185c3106edc7215a2538356"} Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.271471 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerStarted","Data":"2e0bc68cd17eb8cdf12f45ea8fa27aaae9d3ddd95d93952eb9f3534f362600b0"} Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.284051 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rqpxp" podStartSLOduration=4.284036496 podStartE2EDuration="4.284036496s" podCreationTimestamp="2025-10-06 12:26:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:27.276269083 +0000 UTC m=+1073.825974848" watchObservedRunningTime="2025-10-06 12:26:27.284036496 +0000 UTC m=+1073.833742261" Oct 06 12:26:27 crc kubenswrapper[4892]: I1006 12:26:27.388766 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.056698 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.120983 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.239182 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.297556 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.297618 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.313510 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.341263 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.345496 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff0622c4-d0fd-4732-9418-58d60d081887","Type":"ContainerStarted","Data":"b68b173207667e8ecfc2b1fda1ce77722b03d173575c781ce659aa479a985d2e"} Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.345544 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff0622c4-d0fd-4732-9418-58d60d081887","Type":"ContainerStarted","Data":"0c21654092ab3ed0763f730b4898335e595386df4d748289b1cb52954f75d3d4"} Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.356485 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875","Type":"ContainerStarted","Data":"646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5"} Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.356528 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875","Type":"ContainerStarted","Data":"a88b903cabb40e74dfef5426fd2b09ac76cfc9de1f1f740238dd8eae8ce1b57c"} Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.356914 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.357092 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.409241 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" probeResult="failure" output="" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.497904 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.504003 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.548052 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55ddd64775-5m4mj"] Oct 06 12:26:28 crc kubenswrapper[4892]: I1006 12:26:28.553383 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" containerName="dnsmasq-dns" containerID="cri-o://fe18e9b9c621ba56e6aa5c811ad3e1795a7fced9b3def48c4109afffd2aa27d0" gracePeriod=10 Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.382630 4892 generic.go:334] "Generic (PLEG): container finished" podID="6002d110-e634-47ab-b33b-652cbf7b3466" containerID="d1e2d76942f9a3ceb5fa669e3e7d9d47b75b8b88c3c082dcdb6f34306b4cb650" exitCode=1 Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.382895 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerDied","Data":"d1e2d76942f9a3ceb5fa669e3e7d9d47b75b8b88c3c082dcdb6f34306b4cb650"} Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.383609 4892 scope.go:117] "RemoveContainer" 
containerID="d1e2d76942f9a3ceb5fa669e3e7d9d47b75b8b88c3c082dcdb6f34306b4cb650" Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.403611 4892 generic.go:334] "Generic (PLEG): container finished" podID="725ebf18-e01b-4408-af76-e1c187a5abce" containerID="fe18e9b9c621ba56e6aa5c811ad3e1795a7fced9b3def48c4109afffd2aa27d0" exitCode=0 Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.403659 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" event={"ID":"725ebf18-e01b-4408-af76-e1c187a5abce","Type":"ContainerDied","Data":"fe18e9b9c621ba56e6aa5c811ad3e1795a7fced9b3def48c4109afffd2aa27d0"} Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.403765 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.435281 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.475080 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:29 crc kubenswrapper[4892]: I1006 12:26:29.603495 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 06 12:26:30 crc kubenswrapper[4892]: I1006 12:26:30.290287 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:30 crc kubenswrapper[4892]: I1006 12:26:30.577692 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.141:5353: connect: connection refused" Oct 06 12:26:31 crc kubenswrapper[4892]: I1006 12:26:31.426135 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="6b13a195-f0db-44f0-a6f2-5eddc9da4c87" containerName="watcher-applier" containerID="cri-o://a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" gracePeriod=30 Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.435879 4892 generic.go:334] "Generic (PLEG): container finished" podID="657b347a-9a82-404a-b263-f51befcd5837" containerID="3c2a02d34a6fd831789c5c2c29b8643e0b8a93f9f084d92901abee5e94d3a7d3" exitCode=0 Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.435936 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rqpxp" event={"ID":"657b347a-9a82-404a-b263-f51befcd5837","Type":"ContainerDied","Data":"3c2a02d34a6fd831789c5c2c29b8643e0b8a93f9f084d92901abee5e94d3a7d3"} Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.671724 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.699610 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760286 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-sb\") pod \"725ebf18-e01b-4408-af76-e1c187a5abce\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760459 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-scripts\") pod \"2673cbbe-dc84-4a24-a48a-303029fcc02a\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760503 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2673cbbe-dc84-4a24-a48a-303029fcc02a-logs\") pod \"2673cbbe-dc84-4a24-a48a-303029fcc02a\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760521 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-svc\") pod \"725ebf18-e01b-4408-af76-e1c187a5abce\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760547 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-combined-ca-bundle\") pod \"2673cbbe-dc84-4a24-a48a-303029fcc02a\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760584 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9ssg\" (UniqueName: \"kubernetes.io/projected/725ebf18-e01b-4408-af76-e1c187a5abce-kube-api-access-z9ssg\") pod \"725ebf18-e01b-4408-af76-e1c187a5abce\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760613 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-swift-storage-0\") pod \"725ebf18-e01b-4408-af76-e1c187a5abce\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760648 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-config\") pod \"725ebf18-e01b-4408-af76-e1c187a5abce\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760675 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-nb\") pod \"725ebf18-e01b-4408-af76-e1c187a5abce\" (UID: \"725ebf18-e01b-4408-af76-e1c187a5abce\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760702 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr8rh\" (UniqueName: \"kubernetes.io/projected/2673cbbe-dc84-4a24-a48a-303029fcc02a-kube-api-access-mr8rh\") pod \"2673cbbe-dc84-4a24-a48a-303029fcc02a\" (UID: 
\"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.760743 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-config-data\") pod \"2673cbbe-dc84-4a24-a48a-303029fcc02a\" (UID: \"2673cbbe-dc84-4a24-a48a-303029fcc02a\") " Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.766733 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2673cbbe-dc84-4a24-a48a-303029fcc02a-logs" (OuterVolumeSpecName: "logs") pod "2673cbbe-dc84-4a24-a48a-303029fcc02a" (UID: "2673cbbe-dc84-4a24-a48a-303029fcc02a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.770079 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2673cbbe-dc84-4a24-a48a-303029fcc02a-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.794694 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725ebf18-e01b-4408-af76-e1c187a5abce-kube-api-access-z9ssg" (OuterVolumeSpecName: "kube-api-access-z9ssg") pod "725ebf18-e01b-4408-af76-e1c187a5abce" (UID: "725ebf18-e01b-4408-af76-e1c187a5abce"). InnerVolumeSpecName "kube-api-access-z9ssg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.801456 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-scripts" (OuterVolumeSpecName: "scripts") pod "2673cbbe-dc84-4a24-a48a-303029fcc02a" (UID: "2673cbbe-dc84-4a24-a48a-303029fcc02a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.805743 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2673cbbe-dc84-4a24-a48a-303029fcc02a-kube-api-access-mr8rh" (OuterVolumeSpecName: "kube-api-access-mr8rh") pod "2673cbbe-dc84-4a24-a48a-303029fcc02a" (UID: "2673cbbe-dc84-4a24-a48a-303029fcc02a"). InnerVolumeSpecName "kube-api-access-mr8rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.830505 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2673cbbe-dc84-4a24-a48a-303029fcc02a" (UID: "2673cbbe-dc84-4a24-a48a-303029fcc02a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.856472 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-config-data" (OuterVolumeSpecName: "config-data") pod "2673cbbe-dc84-4a24-a48a-303029fcc02a" (UID: "2673cbbe-dc84-4a24-a48a-303029fcc02a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.866488 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-config" (OuterVolumeSpecName: "config") pod "725ebf18-e01b-4408-af76-e1c187a5abce" (UID: "725ebf18-e01b-4408-af76-e1c187a5abce"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.876586 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.876615 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr8rh\" (UniqueName: \"kubernetes.io/projected/2673cbbe-dc84-4a24-a48a-303029fcc02a-kube-api-access-mr8rh\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.876625 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.876633 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.876642 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2673cbbe-dc84-4a24-a48a-303029fcc02a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.876650 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9ssg\" (UniqueName: \"kubernetes.io/projected/725ebf18-e01b-4408-af76-e1c187a5abce-kube-api-access-z9ssg\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.888794 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "725ebf18-e01b-4408-af76-e1c187a5abce" (UID: "725ebf18-e01b-4408-af76-e1c187a5abce"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.921108 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "725ebf18-e01b-4408-af76-e1c187a5abce" (UID: "725ebf18-e01b-4408-af76-e1c187a5abce"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.921839 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "725ebf18-e01b-4408-af76-e1c187a5abce" (UID: "725ebf18-e01b-4408-af76-e1c187a5abce"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.927899 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "725ebf18-e01b-4408-af76-e1c187a5abce" (UID: "725ebf18-e01b-4408-af76-e1c187a5abce"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.978732 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.978764 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.978776 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:32 crc kubenswrapper[4892]: I1006 12:26:32.978785 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/725ebf18-e01b-4408-af76-e1c187a5abce-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.239700 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.247025 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 06 12:26:33 crc kubenswrapper[4892]: E1006 12:26:33.303712 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 06 12:26:33 crc kubenswrapper[4892]: E1006 12:26:33.307667 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 06 12:26:33 crc kubenswrapper[4892]: E1006 12:26:33.310600 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 06 12:26:33 crc kubenswrapper[4892]: E1006 12:26:33.310666 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="6b13a195-f0db-44f0-a6f2-5eddc9da4c87" containerName="watcher-applier" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.445186 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8t56w" event={"ID":"5de602d1-1bde-4049-88a7-d8132dee5d53","Type":"ContainerStarted","Data":"b2efbc72e8e26c11fc05ed783b9dd9ba1b5a6b45bd15433bbe4472edf69efd43"} Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.449835 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff0622c4-d0fd-4732-9418-58d60d081887","Type":"ContainerStarted","Data":"5d81adafc0eea76a3209fb496c1db4db2242eb936080e4c7e6cf5a46abf4fbc2"} Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.451710 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerStarted","Data":"496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8"} Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.453111 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875","Type":"ContainerStarted","Data":"82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348"} Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.455038 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" event={"ID":"725ebf18-e01b-4408-af76-e1c187a5abce","Type":"ContainerDied","Data":"74169174aca2704629fe164c0ee4e479aadca3d064b69d0c2f631a943231bede"} Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.455072 4892 scope.go:117] "RemoveContainer" containerID="fe18e9b9c621ba56e6aa5c811ad3e1795a7fced9b3def48c4109afffd2aa27d0" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.455183 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55ddd64775-5m4mj" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.469301 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8t56w" podStartSLOduration=11.874996808 podStartE2EDuration="22.469284364s" podCreationTimestamp="2025-10-06 12:26:11 +0000 UTC" firstStartedPulling="2025-10-06 12:26:22.13016197 +0000 UTC m=+1068.679867745" lastFinishedPulling="2025-10-06 12:26:32.724449536 +0000 UTC m=+1079.274155301" observedRunningTime="2025-10-06 12:26:33.466574813 +0000 UTC m=+1080.016280578" watchObservedRunningTime="2025-10-06 12:26:33.469284364 +0000 UTC m=+1080.018990119" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.474757 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jps9b" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.475378 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jps9b" event={"ID":"2673cbbe-dc84-4a24-a48a-303029fcc02a","Type":"ContainerDied","Data":"63013211bf2e408f56e0437aa51ca2426f81112e28c0331428ae4401bdac2488"} Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.475410 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63013211bf2e408f56e0437aa51ca2426f81112e28c0331428ae4401bdac2488" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.482747 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.530694 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.530674501 podStartE2EDuration="8.530674501s" podCreationTimestamp="2025-10-06 12:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:33.515658249 +0000 UTC m=+1080.065364024" watchObservedRunningTime="2025-10-06 12:26:33.530674501 +0000 UTC m=+1080.080380266" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.537198 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.537181967 podStartE2EDuration="8.537181967s" podCreationTimestamp="2025-10-06 12:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:33.536655881 +0000 UTC m=+1080.086361646" watchObservedRunningTime="2025-10-06 12:26:33.537181967 +0000 UTC m=+1080.086887732" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.550110 4892 scope.go:117] "RemoveContainer" containerID="a1da148e2dcfeae87a224f2b83a3a6b51a059dd6a42d2f478d5553d9c4eff4a8" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.565030 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55ddd64775-5m4mj"] Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.572416 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55ddd64775-5m4mj"] Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.847851 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-586dc66ccd-lmkkz"] Oct 06 12:26:33 crc kubenswrapper[4892]: E1006 12:26:33.848502 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" containerName="dnsmasq-dns" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.848515 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" containerName="dnsmasq-dns" Oct 06 12:26:33 crc kubenswrapper[4892]: E1006 12:26:33.848533 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2673cbbe-dc84-4a24-a48a-303029fcc02a" containerName="placement-db-sync" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.848540 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2673cbbe-dc84-4a24-a48a-303029fcc02a" containerName="placement-db-sync" Oct 06 12:26:33 crc kubenswrapper[4892]: E1006 12:26:33.848561 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" containerName="init" Oct 
06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.848583 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" containerName="init" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.848762 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2673cbbe-dc84-4a24-a48a-303029fcc02a" containerName="placement-db-sync" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.848779 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" containerName="dnsmasq-dns" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.849929 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.854164 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.854382 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.854716 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.854904 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.856379 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-jjlfz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.887036 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586dc66ccd-lmkkz"] Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.896777 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-config-data\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.896825 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20d1322-c08c-4173-95f0-146b3c2cce04-logs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.896882 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-public-tls-certs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.896924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-combined-ca-bundle\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.896949 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p6t49\" (UniqueName: \"kubernetes.io/projected/e20d1322-c08c-4173-95f0-146b3c2cce04-kube-api-access-p6t49\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.896992 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-scripts\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:33 crc kubenswrapper[4892]: I1006 12:26:33.897007 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-internal-tls-certs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.002142 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-combined-ca-bundle\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.003112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6t49\" (UniqueName: \"kubernetes.io/projected/e20d1322-c08c-4173-95f0-146b3c2cce04-kube-api-access-p6t49\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.003709 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-scripts\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.004003 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-internal-tls-certs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.004220 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-config-data\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.004254 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20d1322-c08c-4173-95f0-146b3c2cce04-logs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.007073 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-public-tls-certs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.013401 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e20d1322-c08c-4173-95f0-146b3c2cce04-logs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.037063 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-config-data\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.040721 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-scripts\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.041956 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-combined-ca-bundle\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.052928 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6t49\" (UniqueName: \"kubernetes.io/projected/e20d1322-c08c-4173-95f0-146b3c2cce04-kube-api-access-p6t49\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.073769 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-public-tls-certs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.075941 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e20d1322-c08c-4173-95f0-146b3c2cce04-internal-tls-certs\") pod \"placement-586dc66ccd-lmkkz\" (UID: \"e20d1322-c08c-4173-95f0-146b3c2cce04\") " pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.161003 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.198928 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-jjlfz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.208155 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.212372 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-combined-ca-bundle\") pod \"657b347a-9a82-404a-b263-f51befcd5837\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.212554 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p9jt\" (UniqueName: \"kubernetes.io/projected/657b347a-9a82-404a-b263-f51befcd5837-kube-api-access-4p9jt\") pod \"657b347a-9a82-404a-b263-f51befcd5837\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.212586 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-fernet-keys\") pod \"657b347a-9a82-404a-b263-f51befcd5837\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.212647 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-credential-keys\") pod \"657b347a-9a82-404a-b263-f51befcd5837\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.212674 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-config-data\") pod \"657b347a-9a82-404a-b263-f51befcd5837\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.212690 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-scripts\") pod \"657b347a-9a82-404a-b263-f51befcd5837\" (UID: \"657b347a-9a82-404a-b263-f51befcd5837\") " Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.217254 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-scripts" (OuterVolumeSpecName: "scripts") pod "657b347a-9a82-404a-b263-f51befcd5837" (UID: "657b347a-9a82-404a-b263-f51befcd5837"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.219591 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "657b347a-9a82-404a-b263-f51befcd5837" (UID: "657b347a-9a82-404a-b263-f51befcd5837"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.220208 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657b347a-9a82-404a-b263-f51befcd5837-kube-api-access-4p9jt" (OuterVolumeSpecName: "kube-api-access-4p9jt") pod "657b347a-9a82-404a-b263-f51befcd5837" (UID: "657b347a-9a82-404a-b263-f51befcd5837"). InnerVolumeSpecName "kube-api-access-4p9jt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.229625 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "657b347a-9a82-404a-b263-f51befcd5837" (UID: "657b347a-9a82-404a-b263-f51befcd5837"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.232644 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725ebf18-e01b-4408-af76-e1c187a5abce" path="/var/lib/kubelet/pods/725ebf18-e01b-4408-af76-e1c187a5abce/volumes" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.277627 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "657b347a-9a82-404a-b263-f51befcd5837" (UID: "657b347a-9a82-404a-b263-f51befcd5837"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.278763 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-config-data" (OuterVolumeSpecName: "config-data") pod "657b347a-9a82-404a-b263-f51befcd5837" (UID: "657b347a-9a82-404a-b263-f51befcd5837"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.325636 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.325781 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p9jt\" (UniqueName: \"kubernetes.io/projected/657b347a-9a82-404a-b263-f51befcd5837-kube-api-access-4p9jt\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.325795 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.325804 4892 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.326076 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.326102 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657b347a-9a82-404a-b263-f51befcd5837-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.513863 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rqpxp" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.514185 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rqpxp" event={"ID":"657b347a-9a82-404a-b263-f51befcd5837","Type":"ContainerDied","Data":"66353aa284a28751ffd560e8b5b8011d58b90f09f45d070f576a894d8881c62c"} Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.514220 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66353aa284a28751ffd560e8b5b8011d58b90f09f45d070f576a894d8881c62c" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.623130 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bd5b56c96-qhdwx"] Oct 06 12:26:34 crc kubenswrapper[4892]: E1006 12:26:34.624012 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657b347a-9a82-404a-b263-f51befcd5837" containerName="keystone-bootstrap" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.624028 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="657b347a-9a82-404a-b263-f51befcd5837" containerName="keystone-bootstrap" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.624225 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="657b347a-9a82-404a-b263-f51befcd5837" containerName="keystone-bootstrap" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.624882 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.630598 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.630871 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.631110 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.631215 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qj9t" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.631362 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.631507 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.649843 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bd5b56c96-qhdwx"] Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.736603 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-internal-tls-certs\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.736655 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-public-tls-certs\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 
12:26:34.736697 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-credential-keys\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.736752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwftj\" (UniqueName: \"kubernetes.io/projected/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-kube-api-access-zwftj\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.736788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-fernet-keys\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.736811 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-scripts\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.736941 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-config-data\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.736983 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-combined-ca-bundle\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.753740 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-586dc66ccd-lmkkz"] Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839111 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-internal-tls-certs\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839547 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-public-tls-certs\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839596 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-credential-keys\") pod 
\"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839662 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwftj\" (UniqueName: \"kubernetes.io/projected/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-kube-api-access-zwftj\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839706 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-fernet-keys\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839745 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-scripts\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839793 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-config-data\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.839827 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-combined-ca-bundle\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.844456 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-combined-ca-bundle\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.847961 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-scripts\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.850685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-fernet-keys\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.851855 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-public-tls-certs\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 
12:26:34.865823 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-internal-tls-certs\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.866557 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-config-data\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.869556 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-credential-keys\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.870808 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwftj\" (UniqueName: \"kubernetes.io/projected/6493eabc-ee54-41dd-a9e4-dfa55fe71dd1-kube-api-access-zwftj\") pod \"keystone-5bd5b56c96-qhdwx\" (UID: \"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1\") " pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:34 crc kubenswrapper[4892]: I1006 12:26:34.963191 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.598878 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bd5b56c96-qhdwx"] Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.603391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586dc66ccd-lmkkz" event={"ID":"e20d1322-c08c-4173-95f0-146b3c2cce04","Type":"ContainerStarted","Data":"1ac1c89eefdf992bf5e4bf2f095d253cef9d2cb3d6b75b27819251004d9b6a46"} Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.603431 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586dc66ccd-lmkkz" event={"ID":"e20d1322-c08c-4173-95f0-146b3c2cce04","Type":"ContainerStarted","Data":"e3af967faa52b74adac232ef521813d5d22a044f17361740314ccb716180c5ae"} Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.603442 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-586dc66ccd-lmkkz" event={"ID":"e20d1322-c08c-4173-95f0-146b3c2cce04","Type":"ContainerStarted","Data":"52475296bf0742cdbc46fb70dab06381e99bfe64f52c7b490f8031a55eb30998"} Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.604617 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.604644 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:26:35 crc kubenswrapper[4892]: W1006 12:26:35.666418 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6493eabc_ee54_41dd_a9e4_dfa55fe71dd1.slice/crio-209f0f6063d9e7b49d3448bf70e4dd808cda27faf983e229504123c325b465df WatchSource:0}: Error finding container 209f0f6063d9e7b49d3448bf70e4dd808cda27faf983e229504123c325b465df: Status 404 returned error can't find the 
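The cadvisor warning above (manager.go:1169) fires because the cgroup appeared before CRI-O had registered the container; the same ID, 209f0f60..., shows up a second later in keystone's ContainerStarted event. A sketch of pulling the container ID and pod UID back out of such a cgroup path (Python, illustrative):

    import re

    CONTAINER_ID = re.compile(r"crio-([0-9a-f]{64})")
    POD_UID = re.compile(r"-pod([0-9a-f_]+)\.slice")

    def parse_cgroup_path(path):
        """Extract (pod_uid, container_id) from a kubepods cgroup path."""
        uid = POD_UID.search(path)
        cid = CONTAINER_ID.search(path)
        return (
            uid.group(1).replace("_", "-") if uid else None,  # 6493eabc_ee54_... -> 6493eabc-ee54-...
            cid.group(1) if cid else None,
        )

    path = ("/kubepods.slice/kubepods-besteffort.slice/"
            "kubepods-besteffort-pod6493eabc_ee54_41dd_a9e4_dfa55fe71dd1.slice/"
            "crio-209f0f6063d9e7b49d3448bf70e4dd808cda27faf983e229504123c325b465df")
    print(parse_cgroup_path(path))
    # -> the keystone-5bd5b56c96-qhdwx pod UID and the 64-hex container ID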
Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.931537 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 06 12:26:35 crc kubenswrapper[4892]: I1006 12:26:35.931815 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.002216 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.031762 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-586dc66ccd-lmkkz" podStartSLOduration=3.031744168 podStartE2EDuration="3.031744168s" podCreationTimestamp="2025-10-06 12:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:35.634477906 +0000 UTC m=+1082.184183671" watchObservedRunningTime="2025-10-06 12:26:36.031744168 +0000 UTC m=+1082.581449933"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.062761 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.162213 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.165742 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-combined-ca-bundle\") pod \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") "
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.165840 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-config-data\") pod \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") "
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.165951 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-logs\") pod \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") "
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.166018 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dsf7\" (UniqueName: \"kubernetes.io/projected/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-kube-api-access-4dsf7\") pod \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\" (UID: \"6b13a195-f0db-44f0-a6f2-5eddc9da4c87\") "
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.170378 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-logs" (OuterVolumeSpecName: "logs") pod "6b13a195-f0db-44f0-a6f2-5eddc9da4c87" (UID: "6b13a195-f0db-44f0-a6f2-5eddc9da4c87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.175596 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-kube-api-access-4dsf7" (OuterVolumeSpecName: "kube-api-access-4dsf7") pod "6b13a195-f0db-44f0-a6f2-5eddc9da4c87" (UID: "6b13a195-f0db-44f0-a6f2-5eddc9da4c87"). InnerVolumeSpecName "kube-api-access-4dsf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.247933 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b13a195-f0db-44f0-a6f2-5eddc9da4c87" (UID: "6b13a195-f0db-44f0-a6f2-5eddc9da4c87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.267297 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-logs\") on node \"crc\" DevicePath \"\""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.267352 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dsf7\" (UniqueName: \"kubernetes.io/projected/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-kube-api-access-4dsf7\") on node \"crc\" DevicePath \"\""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.267363 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.272426 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-config-data" (OuterVolumeSpecName: "config-data") pod "6b13a195-f0db-44f0-a6f2-5eddc9da4c87" (UID: "6b13a195-f0db-44f0-a6f2-5eddc9da4c87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.370561 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b13a195-f0db-44f0-a6f2-5eddc9da4c87-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.405609 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.407107 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.466458 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.469941 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.620883 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bd5b56c96-qhdwx" event={"ID":"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1","Type":"ContainerStarted","Data":"b3de3b97bb9a89db2ef856e6d15d77afef6d73dd57d1f216a71132e375240420"}
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.620930 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bd5b56c96-qhdwx" event={"ID":"6493eabc-ee54-41dd-a9e4-dfa55fe71dd1","Type":"ContainerStarted","Data":"209f0f6063d9e7b49d3448bf70e4dd808cda27faf983e229504123c325b465df"}
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.620970 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5bd5b56c96-qhdwx"
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.623160 4892 generic.go:334] "Generic (PLEG): container finished" podID="6b13a195-f0db-44f0-a6f2-5eddc9da4c87" containerID="a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" exitCode=0
Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.625488 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.625795 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6b13a195-f0db-44f0-a6f2-5eddc9da4c87","Type":"ContainerDied","Data":"a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7"} Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.625823 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.625834 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"6b13a195-f0db-44f0-a6f2-5eddc9da4c87","Type":"ContainerDied","Data":"efa627317e924640ec252a2b519da77e3228e2f29cbccd33957a01ffba8b29c1"} Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.625845 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.625863 4892 scope.go:117] "RemoveContainer" containerID="a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.625992 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.626576 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.648028 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5bd5b56c96-qhdwx" podStartSLOduration=2.648010379 podStartE2EDuration="2.648010379s" podCreationTimestamp="2025-10-06 12:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:36.644644118 +0000 UTC m=+1083.194349883" watchObservedRunningTime="2025-10-06 12:26:36.648010379 +0000 UTC m=+1083.197716144" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.677037 4892 scope.go:117] "RemoveContainer" containerID="a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.677130 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:36 crc kubenswrapper[4892]: E1006 12:26:36.678060 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7\": container with ID starting with a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7 not found: ID does not exist" containerID="a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.678090 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7"} err="failed to get container status \"a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7\": rpc error: code = NotFound desc = could not find container \"a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7\": container with ID starting with a046871027bce752537d2b84ace2da52dfb927cb0c9a6e798673cabefefaa0d7 not found: ID does not exist" Oct 06 12:26:36 crc kubenswrapper[4892]: 
I1006 12:26:36.775379 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.781035 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:36 crc kubenswrapper[4892]: E1006 12:26:36.782648 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b13a195-f0db-44f0-a6f2-5eddc9da4c87" containerName="watcher-applier" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.782673 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b13a195-f0db-44f0-a6f2-5eddc9da4c87" containerName="watcher-applier" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.783046 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b13a195-f0db-44f0-a6f2-5eddc9da4c87" containerName="watcher-applier" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.787592 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7489d9984-82d5x" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.804200 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.806222 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.816507 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.889177 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c68c58656-gbbdd" podUID="f038239e-35e8-4409-a858-d7aad410f5fd" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.168:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.168:8443: connect: connection refused" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.912073 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.912186 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-logs\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.912246 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-config-data\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:36 crc kubenswrapper[4892]: I1006 12:26:36.912268 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stwk4\" (UniqueName: 
\"kubernetes.io/projected/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-kube-api-access-stwk4\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.014426 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-config-data\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.014685 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stwk4\" (UniqueName: \"kubernetes.io/projected/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-kube-api-access-stwk4\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.014785 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.014848 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-logs\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.015131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-logs\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.025196 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.032052 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-config-data\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.040879 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stwk4\" (UniqueName: \"kubernetes.io/projected/10c6d4d0-3a47-4756-a5bb-65ff70e4c677-kube-api-access-stwk4\") pod \"watcher-applier-0\" (UID: \"10c6d4d0-3a47-4756-a5bb-65ff70e4c677\") " pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.150474 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.240239 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.240495 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" containerID="cri-o://4cc36948b06fe9f745c389827ff6fcac12829016f8122d8ccb102e44c82f19e1" gracePeriod=30 Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.241664 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" containerID="cri-o://c75ea41f40beccff9afaf15c1b9ce11f7c8a4e08308724cd8eb9bbb791b3367b" gracePeriod=30 Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.727683 4892 generic.go:334] "Generic (PLEG): container finished" podID="6002d110-e634-47ab-b33b-652cbf7b3466" containerID="496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8" exitCode=1 Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.728033 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerDied","Data":"496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8"} Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.728066 4892 scope.go:117] "RemoveContainer" containerID="d1e2d76942f9a3ceb5fa669e3e7d9d47b75b8b88c3c082dcdb6f34306b4cb650" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.729113 4892 scope.go:117] "RemoveContainer" containerID="496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8" Oct 06 12:26:37 crc kubenswrapper[4892]: E1006 12:26:37.729486 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6002d110-e634-47ab-b33b-652cbf7b3466)\"" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.769005 4892 generic.go:334] "Generic (PLEG): container finished" podID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerID="4cc36948b06fe9f745c389827ff6fcac12829016f8122d8ccb102e44c82f19e1" exitCode=143 Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.769072 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a61c1481-8c3c-490a-97eb-a03156bb7ee5","Type":"ContainerDied","Data":"4cc36948b06fe9f745c389827ff6fcac12829016f8122d8ccb102e44c82f19e1"} Oct 06 12:26:37 crc kubenswrapper[4892]: I1006 12:26:37.782768 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 06 12:26:38 crc kubenswrapper[4892]: I1006 12:26:38.055845 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:38 crc kubenswrapper[4892]: I1006 12:26:38.055915 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:38 crc kubenswrapper[4892]: I1006 12:26:38.055932 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:38 crc 
kubenswrapper[4892]: I1006 12:26:38.055954 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:38 crc kubenswrapper[4892]: I1006 12:26:38.192878 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b13a195-f0db-44f0-a6f2-5eddc9da4c87" path="/var/lib/kubelet/pods/6b13a195-f0db-44f0-a6f2-5eddc9da4c87/volumes" Oct 06 12:26:38 crc kubenswrapper[4892]: I1006 12:26:38.795732 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 12:26:38 crc kubenswrapper[4892]: I1006 12:26:38.795742 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 12:26:38 crc kubenswrapper[4892]: I1006 12:26:38.796234 4892 scope.go:117] "RemoveContainer" containerID="496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8" Oct 06 12:26:38 crc kubenswrapper[4892]: E1006 12:26:38.796602 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6002d110-e634-47ab-b33b-652cbf7b3466)\"" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.156283 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": read tcp 10.217.0.2:36052->10.217.0.169:9322: read: connection reset by peer" Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.156764 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": read tcp 10.217.0.2:36040->10.217.0.169:9322: read: connection reset by peer" Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.164786 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.234733 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.241011 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.806965 4892 generic.go:334] "Generic (PLEG): container finished" podID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerID="c75ea41f40beccff9afaf15c1b9ce11f7c8a4e08308724cd8eb9bbb791b3367b" exitCode=0 Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.807041 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a61c1481-8c3c-490a-97eb-a03156bb7ee5","Type":"ContainerDied","Data":"c75ea41f40beccff9afaf15c1b9ce11f7c8a4e08308724cd8eb9bbb791b3367b"} Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.808174 4892 generic.go:334] "Generic (PLEG): container finished" podID="5de602d1-1bde-4049-88a7-d8132dee5d53" containerID="b2efbc72e8e26c11fc05ed783b9dd9ba1b5a6b45bd15433bbe4472edf69efd43" exitCode=0 Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.808834 4892 scope.go:117] "RemoveContainer" 
containerID="496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8" Oct 06 12:26:39 crc kubenswrapper[4892]: E1006 12:26:39.809117 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6002d110-e634-47ab-b33b-652cbf7b3466)\"" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" Oct 06 12:26:39 crc kubenswrapper[4892]: I1006 12:26:39.809254 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8t56w" event={"ID":"5de602d1-1bde-4049-88a7-d8132dee5d53","Type":"ContainerDied","Data":"b2efbc72e8e26c11fc05ed783b9dd9ba1b5a6b45bd15433bbe4472edf69efd43"} Oct 06 12:26:40 crc kubenswrapper[4892]: I1006 12:26:40.677661 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 06 12:26:41 crc kubenswrapper[4892]: W1006 12:26:41.620405 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10c6d4d0_3a47_4756_a5bb_65ff70e4c677.slice/crio-ac6540d7f4c6f3bf470a5bacc018de040461ed69cc84abf95db88b18d616d3b2 WatchSource:0}: Error finding container ac6540d7f4c6f3bf470a5bacc018de040461ed69cc84abf95db88b18d616d3b2: Status 404 returned error can't find the container with id ac6540d7f4c6f3bf470a5bacc018de040461ed69cc84abf95db88b18d616d3b2 Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.757083 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.830481 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-combined-ca-bundle\") pod \"5de602d1-1bde-4049-88a7-d8132dee5d53\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.830605 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msbpc\" (UniqueName: \"kubernetes.io/projected/5de602d1-1bde-4049-88a7-d8132dee5d53-kube-api-access-msbpc\") pod \"5de602d1-1bde-4049-88a7-d8132dee5d53\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.830651 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-db-sync-config-data\") pod \"5de602d1-1bde-4049-88a7-d8132dee5d53\" (UID: \"5de602d1-1bde-4049-88a7-d8132dee5d53\") " Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.855820 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5de602d1-1bde-4049-88a7-d8132dee5d53" (UID: "5de602d1-1bde-4049-88a7-d8132dee5d53"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.868343 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8t56w" event={"ID":"5de602d1-1bde-4049-88a7-d8132dee5d53","Type":"ContainerDied","Data":"22262330c23f861a3be1686e2c4b09d75d34d5cbb486f5595bccb97a078af36c"} Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.868400 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22262330c23f861a3be1686e2c4b09d75d34d5cbb486f5595bccb97a078af36c" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.869106 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8t56w" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.870617 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"10c6d4d0-3a47-4756-a5bb-65ff70e4c677","Type":"ContainerStarted","Data":"ac6540d7f4c6f3bf470a5bacc018de040461ed69cc84abf95db88b18d616d3b2"} Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.871403 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5de602d1-1bde-4049-88a7-d8132dee5d53" (UID: "5de602d1-1bde-4049-88a7-d8132dee5d53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.871520 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de602d1-1bde-4049-88a7-d8132dee5d53-kube-api-access-msbpc" (OuterVolumeSpecName: "kube-api-access-msbpc") pod "5de602d1-1bde-4049-88a7-d8132dee5d53" (UID: "5de602d1-1bde-4049-88a7-d8132dee5d53"). InnerVolumeSpecName "kube-api-access-msbpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.932594 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.932870 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msbpc\" (UniqueName: \"kubernetes.io/projected/5de602d1-1bde-4049-88a7-d8132dee5d53-kube-api-access-msbpc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:41 crc kubenswrapper[4892]: I1006 12:26:41.932883 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5de602d1-1bde-4049-88a7-d8132dee5d53-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.056501 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f964b9fb4-v4lhn"] Oct 06 12:26:42 crc kubenswrapper[4892]: E1006 12:26:42.057075 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de602d1-1bde-4049-88a7-d8132dee5d53" containerName="barbican-db-sync" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.057094 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de602d1-1bde-4049-88a7-d8132dee5d53" containerName="barbican-db-sync" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.057362 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de602d1-1bde-4049-88a7-d8132dee5d53" containerName="barbican-db-sync" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.058414 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.061930 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.074680 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6984f8c567-2sx4r"] Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.076550 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.086370 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.121007 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6984f8c567-2sx4r"] Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135299 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-config-data-custom\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135368 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-logs\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135421 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-config-data\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135478 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-config-data\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135503 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfb6\" (UniqueName: \"kubernetes.io/projected/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-kube-api-access-pnfb6\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135546 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c02562-a51b-42c5-8797-fe351a4932f7-logs\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135568 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-config-data-custom\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135584 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-combined-ca-bundle\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135610 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-combined-ca-bundle\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.135642 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mjj\" (UniqueName: \"kubernetes.io/projected/72c02562-a51b-42c5-8797-fe351a4932f7-kube-api-access-47mjj\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.153810 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f964b9fb4-v4lhn"] Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.199417 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-554467fdd7-n2trc"] Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.209204 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-554467fdd7-n2trc"] Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.209669 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.236862 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfb6\" (UniqueName: \"kubernetes.io/projected/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-kube-api-access-pnfb6\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.236918 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-sb\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.236984 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c02562-a51b-42c5-8797-fe351a4932f7-logs\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237008 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-config-data-custom\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237028 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-combined-ca-bundle\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237051 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-nb\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237077 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-combined-ca-bundle\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237104 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-svc\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237126 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47mjj\" (UniqueName: \"kubernetes.io/projected/72c02562-a51b-42c5-8797-fe351a4932f7-kube-api-access-47mjj\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237151 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-swift-storage-0\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237183 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-config-data-custom\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237200 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-logs\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237254 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-config-data\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " 
pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237279 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2g6t\" (UniqueName: \"kubernetes.io/projected/b93078cc-28b0-4b5f-a4f9-af91631acd25-kube-api-access-v2g6t\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237383 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-config\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.237403 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-config-data\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.238392 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72c02562-a51b-42c5-8797-fe351a4932f7-logs\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.239964 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-logs\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.242493 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-config-data-custom\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.243639 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-combined-ca-bundle\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.252477 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-config-data-custom\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.255051 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-config-data\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: 
\"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.263168 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-combined-ca-bundle\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.264232 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72c02562-a51b-42c5-8797-fe351a4932f7-config-data\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.266532 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfb6\" (UniqueName: \"kubernetes.io/projected/eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a-kube-api-access-pnfb6\") pod \"barbican-keystone-listener-5f964b9fb4-v4lhn\" (UID: \"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a\") " pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.269942 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mjj\" (UniqueName: \"kubernetes.io/projected/72c02562-a51b-42c5-8797-fe351a4932f7-kube-api-access-47mjj\") pod \"barbican-worker-6984f8c567-2sx4r\" (UID: \"72c02562-a51b-42c5-8797-fe351a4932f7\") " pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.294829 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-657b8d7546-4g6v6"] Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.297286 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.299947 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.314305 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-657b8d7546-4g6v6"] Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.346796 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-swift-storage-0\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.347498 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2g6t\" (UniqueName: \"kubernetes.io/projected/b93078cc-28b0-4b5f-a4f9-af91631acd25-kube-api-access-v2g6t\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348000 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data-custom\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348056 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348099 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-combined-ca-bundle\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348150 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-config\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348228 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e60546e-29f5-4c31-9124-1ee0f28340e8-logs\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348265 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-sb\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " 
pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348748 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b4k8\" (UniqueName: \"kubernetes.io/projected/0e60546e-29f5-4c31-9124-1ee0f28340e8-kube-api-access-9b4k8\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348828 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-nb\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.348904 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-svc\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.349179 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-swift-storage-0\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.349524 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-sb\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.349771 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-svc\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.349988 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-nb\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.350073 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-config\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.362604 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2g6t\" (UniqueName: \"kubernetes.io/projected/b93078cc-28b0-4b5f-a4f9-af91631acd25-kube-api-access-v2g6t\") pod \"dnsmasq-dns-554467fdd7-n2trc\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: 
I1006 12:26:42.390503 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.403158 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6984f8c567-2sx4r" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.451718 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data-custom\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.451757 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.451779 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-combined-ca-bundle\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.451815 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e60546e-29f5-4c31-9124-1ee0f28340e8-logs\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.451856 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b4k8\" (UniqueName: \"kubernetes.io/projected/0e60546e-29f5-4c31-9124-1ee0f28340e8-kube-api-access-9b4k8\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.452529 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e60546e-29f5-4c31-9124-1ee0f28340e8-logs\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.456340 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.457990 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-combined-ca-bundle\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.460907 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data-custom\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.470502 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b4k8\" (UniqueName: \"kubernetes.io/projected/0e60546e-29f5-4c31-9124-1ee0f28340e8-kube-api-access-9b4k8\") pod \"barbican-api-657b8d7546-4g6v6\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.537756 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:42 crc kubenswrapper[4892]: I1006 12:26:42.625223 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.667577 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-54cccc9b7d-psd8k"] Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.669277 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.671582 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.671851 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.703471 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54cccc9b7d-psd8k"] Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.703557 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-config-data\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.703591 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-public-tls-certs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.703634 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-config-data-custom\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.703700 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-combined-ca-bundle\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc 
kubenswrapper[4892]: I1006 12:26:44.703723 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtkqx\" (UniqueName: \"kubernetes.io/projected/fd885701-3239-4926-8393-6e671e0f3b22-kube-api-access-rtkqx\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.703741 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-internal-tls-certs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.703768 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd885701-3239-4926-8393-6e671e0f3b22-logs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.805098 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-config-data-custom\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.805196 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-combined-ca-bundle\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.805223 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtkqx\" (UniqueName: \"kubernetes.io/projected/fd885701-3239-4926-8393-6e671e0f3b22-kube-api-access-rtkqx\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.805242 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-internal-tls-certs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.805273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd885701-3239-4926-8393-6e671e0f3b22-logs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.805360 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-config-data\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" 
Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.805385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-public-tls-certs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.806173 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd885701-3239-4926-8393-6e671e0f3b22-logs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.810974 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-public-tls-certs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.811442 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-config-data-custom\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.811847 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-config-data\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.812113 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-internal-tls-certs\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.812412 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd885701-3239-4926-8393-6e671e0f3b22-combined-ca-bundle\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.825695 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtkqx\" (UniqueName: \"kubernetes.io/projected/fd885701-3239-4926-8393-6e671e0f3b22-kube-api-access-rtkqx\") pod \"barbican-api-54cccc9b7d-psd8k\" (UID: \"fd885701-3239-4926-8393-6e671e0f3b22\") " pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:44 crc kubenswrapper[4892]: I1006 12:26:44.995439 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:48 crc kubenswrapper[4892]: I1006 12:26:48.240825 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:26:48 crc kubenswrapper[4892]: I1006 12:26:48.240940 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:26:48 crc kubenswrapper[4892]: I1006 12:26:48.528229 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:48 crc kubenswrapper[4892]: I1006 12:26:48.549448 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:49 crc kubenswrapper[4892]: I1006 12:26:49.955059 4892 generic.go:334] "Generic (PLEG): container finished" podID="de86e5ee-d52e-4d8b-8077-a0d86175878c" containerID="c8649c4a844019867546376f544e67b95ac672156102e7d197d5fe6c9dfd3682" exitCode=0 Oct 06 12:26:49 crc kubenswrapper[4892]: I1006 12:26:49.955120 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-knkh9" event={"ID":"de86e5ee-d52e-4d8b-8077-a0d86175878c","Type":"ContainerDied","Data":"c8649c4a844019867546376f544e67b95ac672156102e7d197d5fe6c9dfd3682"} Oct 06 12:26:50 crc kubenswrapper[4892]: I1006 12:26:50.154010 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6c68c58656-gbbdd" Oct 06 12:26:50 crc kubenswrapper[4892]: I1006 12:26:50.163100 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:26:50 crc kubenswrapper[4892]: I1006 12:26:50.168744 4892 scope.go:117] "RemoveContainer" containerID="496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8" Oct 06 12:26:50 crc kubenswrapper[4892]: I1006 12:26:50.207744 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7489d9984-82d5x"] Oct 06 12:26:50 crc kubenswrapper[4892]: I1006 12:26:50.963961 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7489d9984-82d5x" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon-log" containerID="cri-o://b929f09ed20951199fcf75f8cafe3036227c5cbc150094f8ea1bac9f3e6ae072" gracePeriod=30 Oct 06 12:26:50 crc kubenswrapper[4892]: I1006 12:26:50.964098 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7489d9984-82d5x" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" containerID="cri-o://18eac723022a92e25d1293baa3888a63d89e6f7b88835b26ef947985f501f48b" gracePeriod=30 Oct 06 12:26:51 crc kubenswrapper[4892]: I1006 12:26:51.980197 4892 generic.go:334] "Generic (PLEG): container finished" podID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerID="18eac723022a92e25d1293baa3888a63d89e6f7b88835b26ef947985f501f48b" exitCode=0 Oct 06 12:26:51 crc kubenswrapper[4892]: I1006 12:26:51.980266 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7489d9984-82d5x" 
event={"ID":"b229a3b8-5243-4ec5-8970-d69b61553a4b","Type":"ContainerDied","Data":"18eac723022a92e25d1293baa3888a63d89e6f7b88835b26ef947985f501f48b"} Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.384461 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.453465 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-config\") pod \"de86e5ee-d52e-4d8b-8077-a0d86175878c\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.453815 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-combined-ca-bundle\") pod \"de86e5ee-d52e-4d8b-8077-a0d86175878c\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.453933 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w26q6\" (UniqueName: \"kubernetes.io/projected/de86e5ee-d52e-4d8b-8077-a0d86175878c-kube-api-access-w26q6\") pod \"de86e5ee-d52e-4d8b-8077-a0d86175878c\" (UID: \"de86e5ee-d52e-4d8b-8077-a0d86175878c\") " Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.473851 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de86e5ee-d52e-4d8b-8077-a0d86175878c-kube-api-access-w26q6" (OuterVolumeSpecName: "kube-api-access-w26q6") pod "de86e5ee-d52e-4d8b-8077-a0d86175878c" (UID: "de86e5ee-d52e-4d8b-8077-a0d86175878c"). InnerVolumeSpecName "kube-api-access-w26q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.505017 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-config" (OuterVolumeSpecName: "config") pod "de86e5ee-d52e-4d8b-8077-a0d86175878c" (UID: "de86e5ee-d52e-4d8b-8077-a0d86175878c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.505521 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de86e5ee-d52e-4d8b-8077-a0d86175878c" (UID: "de86e5ee-d52e-4d8b-8077-a0d86175878c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.557009 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w26q6\" (UniqueName: \"kubernetes.io/projected/de86e5ee-d52e-4d8b-8077-a0d86175878c-kube-api-access-w26q6\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.557065 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.557086 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de86e5ee-d52e-4d8b-8077-a0d86175878c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.998298 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-knkh9" Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.998281 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-knkh9" event={"ID":"de86e5ee-d52e-4d8b-8077-a0d86175878c","Type":"ContainerDied","Data":"3965ee4999104e6e3254165ef652d19371613d91f4066a51e7ece5c231b24633"} Oct 06 12:26:52 crc kubenswrapper[4892]: I1006 12:26:52.998420 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3965ee4999104e6e3254165ef652d19371613d91f4066a51e7ece5c231b24633" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.242518 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.242600 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.242858 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.243429 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:26:53 crc kubenswrapper[4892]: E1006 12:26:53.600376 4892 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 06 12:26:53 crc kubenswrapper[4892]: E1006 12:26:53.600437 4892 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.98:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 06 12:26:53 crc kubenswrapper[4892]: E1006 12:26:53.600569 4892 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.98:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-px9bw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rk9bz_openstack(eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 06 12:26:53 crc kubenswrapper[4892]: E1006 12:26:53.603421 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rk9bz" podUID="eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.651261 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554467fdd7-n2trc"] Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.695146 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78555bc94f-rgqkh"] Oct 06 12:26:53 crc kubenswrapper[4892]: E1006 12:26:53.695884 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de86e5ee-d52e-4d8b-8077-a0d86175878c" containerName="neutron-db-sync" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.695909 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="de86e5ee-d52e-4d8b-8077-a0d86175878c" containerName="neutron-db-sync" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.696190 4892 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="de86e5ee-d52e-4d8b-8077-a0d86175878c" containerName="neutron-db-sync" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.706435 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.735083 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.745629 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78555bc94f-rgqkh"] Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.786012 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a61c1481-8c3c-490a-97eb-a03156bb7ee5-logs\") pod \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.786481 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-combined-ca-bundle\") pod \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.786879 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-custom-prometheus-ca\") pod \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.786946 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvz8\" (UniqueName: \"kubernetes.io/projected/a61c1481-8c3c-490a-97eb-a03156bb7ee5-kube-api-access-wkvz8\") pod \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787017 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-config-data\") pod \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\" (UID: \"a61c1481-8c3c-490a-97eb-a03156bb7ee5\") " Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787038 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a61c1481-8c3c-490a-97eb-a03156bb7ee5-logs" (OuterVolumeSpecName: "logs") pod "a61c1481-8c3c-490a-97eb-a03156bb7ee5" (UID: "a61c1481-8c3c-490a-97eb-a03156bb7ee5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787302 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5hr\" (UniqueName: \"kubernetes.io/projected/cac926c3-363d-4d97-ad7a-aab9960020ea-kube-api-access-ql5hr\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787337 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-sb\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787362 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-svc\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787395 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-config\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787449 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-swift-storage-0\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787505 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-nb\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.787549 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a61c1481-8c3c-490a-97eb-a03156bb7ee5-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.806461 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a61c1481-8c3c-490a-97eb-a03156bb7ee5-kube-api-access-wkvz8" (OuterVolumeSpecName: "kube-api-access-wkvz8") pod "a61c1481-8c3c-490a-97eb-a03156bb7ee5" (UID: "a61c1481-8c3c-490a-97eb-a03156bb7ee5"). InnerVolumeSpecName "kube-api-access-wkvz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.831440 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a61c1481-8c3c-490a-97eb-a03156bb7ee5" (UID: "a61c1481-8c3c-490a-97eb-a03156bb7ee5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.856469 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a61c1481-8c3c-490a-97eb-a03156bb7ee5" (UID: "a61c1481-8c3c-490a-97eb-a03156bb7ee5"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.860558 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-config-data" (OuterVolumeSpecName: "config-data") pod "a61c1481-8c3c-490a-97eb-a03156bb7ee5" (UID: "a61c1481-8c3c-490a-97eb-a03156bb7ee5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.888908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-swift-storage-0\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889007 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-nb\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889082 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5hr\" (UniqueName: \"kubernetes.io/projected/cac926c3-363d-4d97-ad7a-aab9960020ea-kube-api-access-ql5hr\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889114 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-sb\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889137 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-svc\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889166 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-config\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889236 4892 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889246 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvz8\" (UniqueName: \"kubernetes.io/projected/a61c1481-8c3c-490a-97eb-a03156bb7ee5-kube-api-access-wkvz8\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889269 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.889277 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a61c1481-8c3c-490a-97eb-a03156bb7ee5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.890200 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-config\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.890844 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-swift-storage-0\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.891479 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-nb\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.893015 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-sb\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.893534 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-svc\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.945711 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5hr\" (UniqueName: \"kubernetes.io/projected/cac926c3-363d-4d97-ad7a-aab9960020ea-kube-api-access-ql5hr\") pod \"dnsmasq-dns-78555bc94f-rgqkh\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " 
pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.954128 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68f547b878-th9sd"] Oct 06 12:26:53 crc kubenswrapper[4892]: E1006 12:26:53.954599 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.954611 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" Oct 06 12:26:53 crc kubenswrapper[4892]: E1006 12:26:53.954636 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.954643 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.954810 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.954826 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.955933 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.964860 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68f547b878-th9sd"] Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.980961 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.981284 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.981449 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-jpd6d" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.981599 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.993593 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-config\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.993675 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-httpd-config\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.993706 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpxw\" (UniqueName: \"kubernetes.io/projected/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-kube-api-access-pfpxw\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " 
pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.993777 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-ovndb-tls-certs\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:53 crc kubenswrapper[4892]: I1006 12:26:53.993802 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-combined-ca-bundle\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.043566 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.082982 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.083446 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a61c1481-8c3c-490a-97eb-a03156bb7ee5","Type":"ContainerDied","Data":"b03ba473369ee8bb2ae82525d3626fc39e4432b1858b6db4708894f23dff7162"} Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.083521 4892 scope.go:117] "RemoveContainer" containerID="c75ea41f40beccff9afaf15c1b9ce11f7c8a4e08308724cd8eb9bbb791b3367b" Oct 06 12:26:54 crc kubenswrapper[4892]: E1006 12:26:54.085868 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.98:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-rk9bz" podUID="eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.095560 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-httpd-config\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.095604 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpxw\" (UniqueName: \"kubernetes.io/projected/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-kube-api-access-pfpxw\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.095685 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-ovndb-tls-certs\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.095702 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-combined-ca-bundle\") pod \"neutron-68f547b878-th9sd\" (UID: 
\"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.095775 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-config\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.135959 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpxw\" (UniqueName: \"kubernetes.io/projected/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-kube-api-access-pfpxw\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.141153 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-config\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.157823 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-ovndb-tls-certs\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.158022 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-combined-ca-bundle\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.159063 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-httpd-config\") pod \"neutron-68f547b878-th9sd\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.287590 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.336719 4892 scope.go:117] "RemoveContainer" containerID="4cc36948b06fe9f745c389827ff6fcac12829016f8122d8ccb102e44c82f19e1" Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.789099 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-54cccc9b7d-psd8k"] Oct 06 12:26:54 crc kubenswrapper[4892]: I1006 12:26:54.976891 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.025372 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-config-data\") pod \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.025413 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdbff37-e2f5-4972-b932-6b53278fdaf9-logs\") pod \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.025491 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7cdbff37-e2f5-4972-b932-6b53278fdaf9-horizon-secret-key\") pod \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.025522 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-scripts\") pod \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.025544 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh9mc\" (UniqueName: \"kubernetes.io/projected/7cdbff37-e2f5-4972-b932-6b53278fdaf9-kube-api-access-gh9mc\") pod \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\" (UID: \"7cdbff37-e2f5-4972-b932-6b53278fdaf9\") " Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.028934 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cdbff37-e2f5-4972-b932-6b53278fdaf9-logs" (OuterVolumeSpecName: "logs") pod "7cdbff37-e2f5-4972-b932-6b53278fdaf9" (UID: "7cdbff37-e2f5-4972-b932-6b53278fdaf9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.030196 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cdbff37-e2f5-4972-b932-6b53278fdaf9-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.040155 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cdbff37-e2f5-4972-b932-6b53278fdaf9-kube-api-access-gh9mc" (OuterVolumeSpecName: "kube-api-access-gh9mc") pod "7cdbff37-e2f5-4972-b932-6b53278fdaf9" (UID: "7cdbff37-e2f5-4972-b932-6b53278fdaf9"). InnerVolumeSpecName "kube-api-access-gh9mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.076077 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cdbff37-e2f5-4972-b932-6b53278fdaf9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7cdbff37-e2f5-4972-b932-6b53278fdaf9" (UID: "7cdbff37-e2f5-4972-b932-6b53278fdaf9"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.080844 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-scripts" (OuterVolumeSpecName: "scripts") pod "7cdbff37-e2f5-4972-b932-6b53278fdaf9" (UID: "7cdbff37-e2f5-4972-b932-6b53278fdaf9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.110446 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerStarted","Data":"c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab"} Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.114430 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-config-data" (OuterVolumeSpecName: "config-data") pod "7cdbff37-e2f5-4972-b932-6b53278fdaf9" (UID: "7cdbff37-e2f5-4972-b932-6b53278fdaf9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.118403 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"10c6d4d0-3a47-4756-a5bb-65ff70e4c677","Type":"ContainerStarted","Data":"9d75ac9b7ce6f08c72f897f9980110f059f5f9ad7eb3b52f1365f18fb6963e00"} Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.126973 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54cccc9b7d-psd8k" event={"ID":"fd885701-3239-4926-8393-6e671e0f3b22","Type":"ContainerStarted","Data":"5ce3c5a45b36122465bd3d8f25359ac83011484774b6aa62e7be7efdc728db8b"} Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.132239 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.132264 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7cdbff37-e2f5-4972-b932-6b53278fdaf9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.132275 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7cdbff37-e2f5-4972-b932-6b53278fdaf9-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.132285 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh9mc\" (UniqueName: \"kubernetes.io/projected/7cdbff37-e2f5-4972-b932-6b53278fdaf9-kube-api-access-gh9mc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.171663 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=19.17162322 podStartE2EDuration="19.17162322s" podCreationTimestamp="2025-10-06 12:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:55.157071052 +0000 UTC m=+1101.706776817" watchObservedRunningTime="2025-10-06 12:26:55.17162322 +0000 UTC m=+1101.721328995" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.173532 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerStarted","Data":"67eb75004ff0daa385a53bd94c782def7df97e07fa88afcdc0b6ded5ebf7627a"} Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.188639 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78555bc94f-rgqkh"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.189837 4892 generic.go:334] "Generic (PLEG): container finished" podID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerID="48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32" exitCode=137 Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.189863 4892 generic.go:334] "Generic (PLEG): container finished" podID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerID="1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b" exitCode=137 Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.189881 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cb478ff49-q7wrl" event={"ID":"7cdbff37-e2f5-4972-b932-6b53278fdaf9","Type":"ContainerDied","Data":"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32"} Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.189901 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cb478ff49-q7wrl" event={"ID":"7cdbff37-e2f5-4972-b932-6b53278fdaf9","Type":"ContainerDied","Data":"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b"} Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.189912 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6cb478ff49-q7wrl" event={"ID":"7cdbff37-e2f5-4972-b932-6b53278fdaf9","Type":"ContainerDied","Data":"6325f6f109cb7e2b285db27d6cc06cb61a8b15a996fd7d7104bf3100d4256e6e"} Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.189928 4892 scope.go:117] "RemoveContainer" containerID="48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.190057 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6cb478ff49-q7wrl" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.200562 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-657b8d7546-4g6v6"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.215514 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6984f8c567-2sx4r"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.233429 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554467fdd7-n2trc"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.246921 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f964b9fb4-v4lhn"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.311943 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6cb478ff49-q7wrl"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.349030 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6cb478ff49-q7wrl"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.454606 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68f547b878-th9sd"] Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.467663 4892 scope.go:117] "RemoveContainer" containerID="1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.547887 4892 scope.go:117] "RemoveContainer" containerID="48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32" Oct 06 12:26:55 crc kubenswrapper[4892]: E1006 12:26:55.549859 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32\": container with ID starting with 48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32 not found: ID does not exist" containerID="48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.549955 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32"} err="failed to get container status \"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32\": rpc error: code = NotFound desc = could not find container \"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32\": container with ID starting with 48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32 not found: ID does not exist" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.550586 4892 scope.go:117] "RemoveContainer" containerID="1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b" Oct 06 12:26:55 crc kubenswrapper[4892]: E1006 12:26:55.556416 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b\": container with ID starting with 1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b not found: ID does not exist" containerID="1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.556480 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b"} err="failed to get container status 
\"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b\": rpc error: code = NotFound desc = could not find container \"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b\": container with ID starting with 1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b not found: ID does not exist" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.556513 4892 scope.go:117] "RemoveContainer" containerID="48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.557259 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32"} err="failed to get container status \"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32\": rpc error: code = NotFound desc = could not find container \"48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32\": container with ID starting with 48b7e48c87b269544a982e1af65399b5b52327cef8ff953b3dc586a033826d32 not found: ID does not exist" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.557300 4892 scope.go:117] "RemoveContainer" containerID="1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b" Oct 06 12:26:55 crc kubenswrapper[4892]: I1006 12:26:55.558455 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b"} err="failed to get container status \"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b\": rpc error: code = NotFound desc = could not find container \"1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b\": container with ID starting with 1144820ab9c4df28ab6e03f4de34548ff2adc5e01656c312d415acb26f0c202b not found: ID does not exist" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.019296 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.083860 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.097425 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-scripts\") pod \"70af3608-42f9-456b-9035-3030027e04ca\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.097481 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-config-data\") pod \"70af3608-42f9-456b-9035-3030027e04ca\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.097528 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70af3608-42f9-456b-9035-3030027e04ca-logs\") pod \"70af3608-42f9-456b-9035-3030027e04ca\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.097654 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz648\" (UniqueName: \"kubernetes.io/projected/70af3608-42f9-456b-9035-3030027e04ca-kube-api-access-pz648\") pod \"70af3608-42f9-456b-9035-3030027e04ca\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.097720 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70af3608-42f9-456b-9035-3030027e04ca-horizon-secret-key\") pod \"70af3608-42f9-456b-9035-3030027e04ca\" (UID: \"70af3608-42f9-456b-9035-3030027e04ca\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.099750 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70af3608-42f9-456b-9035-3030027e04ca-logs" (OuterVolumeSpecName: "logs") pod "70af3608-42f9-456b-9035-3030027e04ca" (UID: "70af3608-42f9-456b-9035-3030027e04ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.116540 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70af3608-42f9-456b-9035-3030027e04ca-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "70af3608-42f9-456b-9035-3030027e04ca" (UID: "70af3608-42f9-456b-9035-3030027e04ca"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.144524 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70af3608-42f9-456b-9035-3030027e04ca-kube-api-access-pz648" (OuterVolumeSpecName: "kube-api-access-pz648") pod "70af3608-42f9-456b-9035-3030027e04ca" (UID: "70af3608-42f9-456b-9035-3030027e04ca"). InnerVolumeSpecName "kube-api-access-pz648". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.174892 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-config-data" (OuterVolumeSpecName: "config-data") pod "70af3608-42f9-456b-9035-3030027e04ca" (UID: "70af3608-42f9-456b-9035-3030027e04ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.178140 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-scripts" (OuterVolumeSpecName: "scripts") pod "70af3608-42f9-456b-9035-3030027e04ca" (UID: "70af3608-42f9-456b-9035-3030027e04ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201139 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b783419-dc0f-4bac-84fd-043c68de8718-logs\") pod \"2b783419-dc0f-4bac-84fd-043c68de8718\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201245 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b783419-dc0f-4bac-84fd-043c68de8718-horizon-secret-key\") pod \"2b783419-dc0f-4bac-84fd-043c68de8718\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201273 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95dhd\" (UniqueName: \"kubernetes.io/projected/2b783419-dc0f-4bac-84fd-043c68de8718-kube-api-access-95dhd\") pod \"2b783419-dc0f-4bac-84fd-043c68de8718\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201311 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-config-data\") pod \"2b783419-dc0f-4bac-84fd-043c68de8718\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201370 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-scripts\") pod \"2b783419-dc0f-4bac-84fd-043c68de8718\" (UID: \"2b783419-dc0f-4bac-84fd-043c68de8718\") " Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201634 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b783419-dc0f-4bac-84fd-043c68de8718-logs" (OuterVolumeSpecName: "logs") pod "2b783419-dc0f-4bac-84fd-043c68de8718" (UID: "2b783419-dc0f-4bac-84fd-043c68de8718"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201904 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/70af3608-42f9-456b-9035-3030027e04ca-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201921 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201930 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b783419-dc0f-4bac-84fd-043c68de8718-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201939 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/70af3608-42f9-456b-9035-3030027e04ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201948 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70af3608-42f9-456b-9035-3030027e04ca-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.201956 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz648\" (UniqueName: \"kubernetes.io/projected/70af3608-42f9-456b-9035-3030027e04ca-kube-api-access-pz648\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.208584 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b783419-dc0f-4bac-84fd-043c68de8718-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2b783419-dc0f-4bac-84fd-043c68de8718" (UID: "2b783419-dc0f-4bac-84fd-043c68de8718"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.211584 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b783419-dc0f-4bac-84fd-043c68de8718-kube-api-access-95dhd" (OuterVolumeSpecName: "kube-api-access-95dhd") pod "2b783419-dc0f-4bac-84fd-043c68de8718" (UID: "2b783419-dc0f-4bac-84fd-043c68de8718"). InnerVolumeSpecName "kube-api-access-95dhd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.216159 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" path="/var/lib/kubelet/pods/7cdbff37-e2f5-4972-b932-6b53278fdaf9/volumes" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.235935 4892 generic.go:334] "Generic (PLEG): container finished" podID="70af3608-42f9-456b-9035-3030027e04ca" containerID="19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d" exitCode=137 Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.235973 4892 generic.go:334] "Generic (PLEG): container finished" podID="70af3608-42f9-456b-9035-3030027e04ca" containerID="e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce" exitCode=137 Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.236022 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976c9f5c7-4g42j" event={"ID":"70af3608-42f9-456b-9035-3030027e04ca","Type":"ContainerDied","Data":"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.236046 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976c9f5c7-4g42j" event={"ID":"70af3608-42f9-456b-9035-3030027e04ca","Type":"ContainerDied","Data":"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.236059 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7976c9f5c7-4g42j" event={"ID":"70af3608-42f9-456b-9035-3030027e04ca","Type":"ContainerDied","Data":"83ed8ef824a448373ed36d71066d4a81b25a7cf92f35a7239cc4b2514d126824"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.236079 4892 scope.go:117] "RemoveContainer" containerID="19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.236199 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.246979 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657b8d7546-4g6v6" event={"ID":"0e60546e-29f5-4c31-9124-1ee0f28340e8","Type":"ContainerStarted","Data":"f4cc11332c8a54bb3076da3365934c84efc90102ea95e556ea3fa7cb3182bc83"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.255454 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f547b878-th9sd" event={"ID":"c09ee4a5-59be-4a2c-8701-a572cc9a71ec","Type":"ContainerStarted","Data":"bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.255503 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f547b878-th9sd" event={"ID":"c09ee4a5-59be-4a2c-8701-a572cc9a71ec","Type":"ContainerStarted","Data":"506c8d5bf728c9326e92e31e86810cc723543100204fc5b35f67e756b96bff03"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.266135 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-scripts" (OuterVolumeSpecName: "scripts") pod "2b783419-dc0f-4bac-84fd-043c68de8718" (UID: "2b783419-dc0f-4bac-84fd-043c68de8718"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.267365 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-config-data" (OuterVolumeSpecName: "config-data") pod "2b783419-dc0f-4bac-84fd-043c68de8718" (UID: "2b783419-dc0f-4bac-84fd-043c68de8718"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.270557 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554467fdd7-n2trc" event={"ID":"b93078cc-28b0-4b5f-a4f9-af91631acd25","Type":"ContainerStarted","Data":"bc9ac29fecdd4a78a3697849d76f9be933577f4742008663afc9ec964c4f8e52"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.276501 4892 generic.go:334] "Generic (PLEG): container finished" podID="2b783419-dc0f-4bac-84fd-043c68de8718" containerID="8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870" exitCode=137 Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.276532 4892 generic.go:334] "Generic (PLEG): container finished" podID="2b783419-dc0f-4bac-84fd-043c68de8718" containerID="f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6" exitCode=137 Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.276572 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686dd487d5-78rg9" event={"ID":"2b783419-dc0f-4bac-84fd-043c68de8718","Type":"ContainerDied","Data":"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.276598 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686dd487d5-78rg9" event={"ID":"2b783419-dc0f-4bac-84fd-043c68de8718","Type":"ContainerDied","Data":"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.276608 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686dd487d5-78rg9" event={"ID":"2b783419-dc0f-4bac-84fd-043c68de8718","Type":"ContainerDied","Data":"525d33dcdf230d0b67fa6b91110498ba0861d1d71e4b28b1614169258a1cb64e"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.276661 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686dd487d5-78rg9" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.291053 4892 generic.go:334] "Generic (PLEG): container finished" podID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerID="dcf966df5f31628b89240fac42576421acece51a4aed8d4aa9a6cd0a14fcdcba" exitCode=0 Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.291189 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" event={"ID":"cac926c3-363d-4d97-ad7a-aab9960020ea","Type":"ContainerDied","Data":"dcf966df5f31628b89240fac42576421acece51a4aed8d4aa9a6cd0a14fcdcba"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.291261 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" event={"ID":"cac926c3-363d-4d97-ad7a-aab9960020ea","Type":"ContainerStarted","Data":"cadc9025aab3587f956745adc3c95e27f7519b9bb499e1ca834941f520c781f1"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.299526 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6984f8c567-2sx4r" event={"ID":"72c02562-a51b-42c5-8797-fe351a4932f7","Type":"ContainerStarted","Data":"b11552b1cda283eda853b321dac3bfb26d48ba86f6b47e5e6e3677c66df801b6"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.303715 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2b783419-dc0f-4bac-84fd-043c68de8718-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.303739 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95dhd\" (UniqueName: \"kubernetes.io/projected/2b783419-dc0f-4bac-84fd-043c68de8718-kube-api-access-95dhd\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.303750 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.303760 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b783419-dc0f-4bac-84fd-043c68de8718-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.311330 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" event={"ID":"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a","Type":"ContainerStarted","Data":"32ea35bb4fb1ecdd02cebfc85b1af7d594f6beb5cba4c1b09994e28fe016bfb2"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.348286 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54cccc9b7d-psd8k" event={"ID":"fd885701-3239-4926-8393-6e671e0f3b22","Type":"ContainerStarted","Data":"a7ed9aecfff24f2352004cab837c688ceb61b74c23292b2e8296f847ef53affc"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.348723 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-54cccc9b7d-psd8k" event={"ID":"fd885701-3239-4926-8393-6e671e0f3b22","Type":"ContainerStarted","Data":"2a786ff7d72775b2684579414b5a455dc05c9868c62624653d29b6d2202c3962"} Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.348738 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.349810 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.390413 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-686dd487d5-78rg9"] Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.415823 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-54cccc9b7d-psd8k" podStartSLOduration=12.415805892 podStartE2EDuration="12.415805892s" podCreationTimestamp="2025-10-06 12:26:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:56.391858312 +0000 UTC m=+1102.941564067" watchObservedRunningTime="2025-10-06 12:26:56.415805892 +0000 UTC m=+1102.965511657" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.419482 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-686dd487d5-78rg9"] Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.558520 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-696599f45c-ndwj5"] Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.558900 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.558912 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.558923 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.558929 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.558957 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.558965 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.558984 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.558990 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.559001 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559008 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.559022 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559028 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559186 4892 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559199 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559218 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon-log" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559231 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="70af3608-42f9-456b-9035-3030027e04ca" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559246 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cdbff37-e2f5-4972-b932-6b53278fdaf9" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.559253 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" containerName="horizon" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.560278 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.564716 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.564884 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.585232 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-696599f45c-ndwj5"] Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.608723 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2qnb\" (UniqueName: \"kubernetes.io/projected/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-kube-api-access-s2qnb\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.608754 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-httpd-config\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.608822 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-config\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.608873 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-internal-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.608889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-public-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.608912 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-ovndb-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.608950 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-combined-ca-bundle\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.625368 4892 scope.go:117] "RemoveContainer" containerID="e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.711520 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-config\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.711803 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-internal-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.711824 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-public-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.711848 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-ovndb-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.711889 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-combined-ca-bundle\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.711930 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2qnb\" (UniqueName: \"kubernetes.io/projected/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-kube-api-access-s2qnb\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: 
I1006 12:26:56.711948 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-httpd-config\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.717210 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-httpd-config\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.719051 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-config\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.720890 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-ovndb-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.722259 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-public-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.734018 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-internal-tls-certs\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.746834 4892 scope.go:117] "RemoveContainer" containerID="19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.755411 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-combined-ca-bundle\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.774864 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7489d9984-82d5x" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.775158 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d\": container with ID starting with 19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d not found: ID does not exist" 
containerID="19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.775185 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d"} err="failed to get container status \"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d\": rpc error: code = NotFound desc = could not find container \"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d\": container with ID starting with 19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d not found: ID does not exist" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.775207 4892 scope.go:117] "RemoveContainer" containerID="e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce" Oct 06 12:26:56 crc kubenswrapper[4892]: E1006 12:26:56.775967 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce\": container with ID starting with e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce not found: ID does not exist" containerID="e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.775994 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce"} err="failed to get container status \"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce\": rpc error: code = NotFound desc = could not find container \"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce\": container with ID starting with e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce not found: ID does not exist" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.776007 4892 scope.go:117] "RemoveContainer" containerID="19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.776815 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d"} err="failed to get container status \"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d\": rpc error: code = NotFound desc = could not find container \"19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d\": container with ID starting with 19c0bf3d16b2c0ec7c48a067026ba414ab58a8eb88b21ee475ec1697b260051d not found: ID does not exist" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.776842 4892 scope.go:117] "RemoveContainer" containerID="e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.778534 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2qnb\" (UniqueName: \"kubernetes.io/projected/7ee89f5c-ce92-4b17-82b5-7b8be64fa649-kube-api-access-s2qnb\") pod \"neutron-696599f45c-ndwj5\" (UID: \"7ee89f5c-ce92-4b17-82b5-7b8be64fa649\") " pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.792892 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce"} err="failed to get container status 
\"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce\": rpc error: code = NotFound desc = could not find container \"e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce\": container with ID starting with e9d8ad835ca2c6a8014313d326dd70e692f48ab166209f7256889dada46318ce not found: ID does not exist" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.792926 4892 scope.go:117] "RemoveContainer" containerID="8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.900592 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:26:56 crc kubenswrapper[4892]: I1006 12:26:56.912314 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.036457 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-sb\") pod \"b93078cc-28b0-4b5f-a4f9-af91631acd25\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.036600 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2g6t\" (UniqueName: \"kubernetes.io/projected/b93078cc-28b0-4b5f-a4f9-af91631acd25-kube-api-access-v2g6t\") pod \"b93078cc-28b0-4b5f-a4f9-af91631acd25\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.036634 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-config\") pod \"b93078cc-28b0-4b5f-a4f9-af91631acd25\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.036680 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-swift-storage-0\") pod \"b93078cc-28b0-4b5f-a4f9-af91631acd25\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.036762 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-svc\") pod \"b93078cc-28b0-4b5f-a4f9-af91631acd25\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.036850 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-nb\") pod \"b93078cc-28b0-4b5f-a4f9-af91631acd25\" (UID: \"b93078cc-28b0-4b5f-a4f9-af91631acd25\") " Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.052587 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93078cc-28b0-4b5f-a4f9-af91631acd25-kube-api-access-v2g6t" (OuterVolumeSpecName: "kube-api-access-v2g6t") pod "b93078cc-28b0-4b5f-a4f9-af91631acd25" (UID: "b93078cc-28b0-4b5f-a4f9-af91631acd25"). InnerVolumeSpecName "kube-api-access-v2g6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.062724 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b93078cc-28b0-4b5f-a4f9-af91631acd25" (UID: "b93078cc-28b0-4b5f-a4f9-af91631acd25"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.073212 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b93078cc-28b0-4b5f-a4f9-af91631acd25" (UID: "b93078cc-28b0-4b5f-a4f9-af91631acd25"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.078717 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-config" (OuterVolumeSpecName: "config") pod "b93078cc-28b0-4b5f-a4f9-af91631acd25" (UID: "b93078cc-28b0-4b5f-a4f9-af91631acd25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.089122 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b93078cc-28b0-4b5f-a4f9-af91631acd25" (UID: "b93078cc-28b0-4b5f-a4f9-af91631acd25"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.093330 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b93078cc-28b0-4b5f-a4f9-af91631acd25" (UID: "b93078cc-28b0-4b5f-a4f9-af91631acd25"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.139822 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2g6t\" (UniqueName: \"kubernetes.io/projected/b93078cc-28b0-4b5f-a4f9-af91631acd25-kube-api-access-v2g6t\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.139872 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.139887 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.139899 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.140114 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.140125 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b93078cc-28b0-4b5f-a4f9-af91631acd25-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.151469 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.151627 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.198556 4892 scope.go:117] "RemoveContainer" containerID="f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.207503 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.389125 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" event={"ID":"cac926c3-363d-4d97-ad7a-aab9960020ea","Type":"ContainerStarted","Data":"8c04c36214f42adb11633d352a79a289d237e7b151efdb3764be56ff6b870e37"} Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.390422 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.405666 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657b8d7546-4g6v6" event={"ID":"0e60546e-29f5-4c31-9124-1ee0f28340e8","Type":"ContainerStarted","Data":"49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2"} Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.405709 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657b8d7546-4g6v6" event={"ID":"0e60546e-29f5-4c31-9124-1ee0f28340e8","Type":"ContainerStarted","Data":"1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e"} Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.406425 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.406468 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.409487 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f547b878-th9sd" event={"ID":"c09ee4a5-59be-4a2c-8701-a572cc9a71ec","Type":"ContainerStarted","Data":"8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8"} Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.410317 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.418346 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" podStartSLOduration=4.418331244 podStartE2EDuration="4.418331244s" podCreationTimestamp="2025-10-06 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:57.414616902 +0000 UTC m=+1103.964322667" watchObservedRunningTime="2025-10-06 12:26:57.418331244 +0000 UTC m=+1103.968036999" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.424330 4892 generic.go:334] "Generic (PLEG): container finished" podID="b93078cc-28b0-4b5f-a4f9-af91631acd25" containerID="c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b" exitCode=0 Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.425190 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-554467fdd7-n2trc" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.425405 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554467fdd7-n2trc" event={"ID":"b93078cc-28b0-4b5f-a4f9-af91631acd25","Type":"ContainerDied","Data":"c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b"} Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.425449 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-554467fdd7-n2trc" event={"ID":"b93078cc-28b0-4b5f-a4f9-af91631acd25","Type":"ContainerDied","Data":"bc9ac29fecdd4a78a3697849d76f9be933577f4742008663afc9ec964c4f8e52"} Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.439842 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68f547b878-th9sd" podStartSLOduration=4.43980527 podStartE2EDuration="4.43980527s" podCreationTimestamp="2025-10-06 12:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:57.436412948 +0000 UTC m=+1103.986118713" watchObservedRunningTime="2025-10-06 12:26:57.43980527 +0000 UTC m=+1103.989511035" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.477740 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-657b8d7546-4g6v6" podStartSLOduration=15.477718591 podStartE2EDuration="15.477718591s" podCreationTimestamp="2025-10-06 12:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:26:57.462774131 +0000 UTC m=+1104.012479896" watchObservedRunningTime="2025-10-06 12:26:57.477718591 +0000 UTC m=+1104.027424346" Oct 06 
12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.532670 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.582620 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-554467fdd7-n2trc"] Oct 06 12:26:57 crc kubenswrapper[4892]: I1006 12:26:57.591418 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-554467fdd7-n2trc"] Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.056029 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.100050 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.190851 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b783419-dc0f-4bac-84fd-043c68de8718" path="/var/lib/kubelet/pods/2b783419-dc0f-4bac-84fd-043c68de8718/volumes" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.202452 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b93078cc-28b0-4b5f-a4f9-af91631acd25" path="/var/lib/kubelet/pods/b93078cc-28b0-4b5f-a4f9-af91631acd25/volumes" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.243271 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.243351 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.407456 4892 scope.go:117] "RemoveContainer" containerID="8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870" Oct 06 12:26:58 crc kubenswrapper[4892]: E1006 12:26:58.407935 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870\": container with ID starting with 8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870 not found: ID does not exist" containerID="8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.407962 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870"} err="failed to get container status \"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870\": rpc error: code = NotFound desc = could not find container \"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870\": container with ID starting with 8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870 not found: ID does not exist" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.407982 4892 scope.go:117] "RemoveContainer" containerID="f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6" Oct 06 12:26:58 crc kubenswrapper[4892]: E1006 
12:26:58.408386 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6\": container with ID starting with f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6 not found: ID does not exist" containerID="f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.408589 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6"} err="failed to get container status \"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6\": rpc error: code = NotFound desc = could not find container \"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6\": container with ID starting with f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6 not found: ID does not exist" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.408628 4892 scope.go:117] "RemoveContainer" containerID="8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.408897 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870"} err="failed to get container status \"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870\": rpc error: code = NotFound desc = could not find container \"8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870\": container with ID starting with 8c123e19dd7d5cab1f27e8051fe920fef89b9b83f5ed7f3cecc782dfb2833870 not found: ID does not exist" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.408916 4892 scope.go:117] "RemoveContainer" containerID="f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.409117 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6"} err="failed to get container status \"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6\": rpc error: code = NotFound desc = could not find container \"f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6\": container with ID starting with f877c394e9af9213dcd5d024aaa3955e3a1c45563fc5af40974e5924d7dfa2c6 not found: ID does not exist" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.409144 4892 scope.go:117] "RemoveContainer" containerID="c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.444171 4892 generic.go:334] "Generic (PLEG): container finished" podID="6002d110-e634-47ab-b33b-652cbf7b3466" containerID="c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab" exitCode=1 Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.444238 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerDied","Data":"c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab"} Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.445123 4892 scope.go:117] "RemoveContainer" containerID="c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab" Oct 06 12:26:58 crc kubenswrapper[4892]: E1006 12:26:58.445515 4892 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6002d110-e634-47ab-b33b-652cbf7b3466)\"" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.764986 4892 scope.go:117] "RemoveContainer" containerID="c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b" Oct 06 12:26:58 crc kubenswrapper[4892]: E1006 12:26:58.765820 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b\": container with ID starting with c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b not found: ID does not exist" containerID="c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.765863 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b"} err="failed to get container status \"c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b\": rpc error: code = NotFound desc = could not find container \"c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b\": container with ID starting with c462b9d4ad4ae0aa3f8be861414bc9a19f8d35a250963b926764d9abf773a11b not found: ID does not exist" Oct 06 12:26:58 crc kubenswrapper[4892]: I1006 12:26:58.765891 4892 scope.go:117] "RemoveContainer" containerID="496cfbd96e1426e4f91444daf5b7a3fa29a01be422c2afa18abcada3aeaeeba8" Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.335433 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-696599f45c-ndwj5"] Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.461687 4892 scope.go:117] "RemoveContainer" containerID="c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab" Oct 06 12:26:59 crc kubenswrapper[4892]: E1006 12:26:59.462923 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6002d110-e634-47ab-b33b-652cbf7b3466)\"" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.467244 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6984f8c567-2sx4r" event={"ID":"72c02562-a51b-42c5-8797-fe351a4932f7","Type":"ContainerStarted","Data":"e7c804a1cf64a37f47a64721c06c897fecc871861d326f0ce803558bfe4a881b"} Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.467279 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6984f8c567-2sx4r" event={"ID":"72c02562-a51b-42c5-8797-fe351a4932f7","Type":"ContainerStarted","Data":"3a1a1342c1ed8add5e0724e6bc32cd1edd3609f387d45440d2cbdfb297374436"} Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.469565 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-696599f45c-ndwj5" event={"ID":"7ee89f5c-ce92-4b17-82b5-7b8be64fa649","Type":"ContainerStarted","Data":"9f86125eecbee78cd2026a578cd2c26a7957a06160d8c7d70b6d5a7a4f57ada7"} Oct 06 12:26:59 crc 
kubenswrapper[4892]: I1006 12:26:59.489079 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" event={"ID":"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a","Type":"ContainerStarted","Data":"8afec3eeaf8c5e013f8b56aa4ea3ee22a2e266e9c8fcc5609722122c13896289"} Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.489161 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" event={"ID":"eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a","Type":"ContainerStarted","Data":"fbdaf97903e9fdd5b72af273b44209e18db4ca217110b7735d904ff82c7af5fa"} Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.506830 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6984f8c567-2sx4r" podStartSLOduration=13.87554352 podStartE2EDuration="17.506789668s" podCreationTimestamp="2025-10-06 12:26:42 +0000 UTC" firstStartedPulling="2025-10-06 12:26:55.208751767 +0000 UTC m=+1101.758457532" lastFinishedPulling="2025-10-06 12:26:58.839997915 +0000 UTC m=+1105.389703680" observedRunningTime="2025-10-06 12:26:59.495556171 +0000 UTC m=+1106.045261936" watchObservedRunningTime="2025-10-06 12:26:59.506789668 +0000 UTC m=+1106.056495423" Oct 06 12:26:59 crc kubenswrapper[4892]: I1006 12:26:59.526983 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f964b9fb4-v4lhn" podStartSLOduration=14.162165543 podStartE2EDuration="17.526959224s" podCreationTimestamp="2025-10-06 12:26:42 +0000 UTC" firstStartedPulling="2025-10-06 12:26:55.475113811 +0000 UTC m=+1102.024819576" lastFinishedPulling="2025-10-06 12:26:58.839907492 +0000 UTC m=+1105.389613257" observedRunningTime="2025-10-06 12:26:59.512123303 +0000 UTC m=+1106.061829068" watchObservedRunningTime="2025-10-06 12:26:59.526959224 +0000 UTC m=+1106.076664989" Oct 06 12:27:00 crc kubenswrapper[4892]: I1006 12:27:00.518873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-696599f45c-ndwj5" event={"ID":"7ee89f5c-ce92-4b17-82b5-7b8be64fa649","Type":"ContainerStarted","Data":"e5c9d0ec33a778e14d4bc53c2f311c2da73e26139bcad1aee65629fbe86313f6"} Oct 06 12:27:00 crc kubenswrapper[4892]: I1006 12:27:00.520994 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:27:00 crc kubenswrapper[4892]: I1006 12:27:00.521140 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-696599f45c-ndwj5" event={"ID":"7ee89f5c-ce92-4b17-82b5-7b8be64fa649","Type":"ContainerStarted","Data":"d940a38689d10e1e2da116f1c7575cdffe47af33744f84515a67929c505b8947"} Oct 06 12:27:00 crc kubenswrapper[4892]: I1006 12:27:00.551861 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-696599f45c-ndwj5" podStartSLOduration=4.551844647 podStartE2EDuration="4.551844647s" podCreationTimestamp="2025-10-06 12:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:00.542091404 +0000 UTC m=+1107.091797169" watchObservedRunningTime="2025-10-06 12:27:00.551844647 +0000 UTC m=+1107.101550412" Oct 06 12:27:04 crc kubenswrapper[4892]: I1006 12:27:04.044578 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:27:04 crc kubenswrapper[4892]: I1006 12:27:04.139946 4892 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-6c68f45bbf-65dm9"] Oct 06 12:27:04 crc kubenswrapper[4892]: I1006 12:27:04.140170 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" podUID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerName="dnsmasq-dns" containerID="cri-o://aa831f1d1cc18eaad85f24e4ffb6f6cda2fba4aad778ae21f357a85050e812fd" gracePeriod=10 Oct 06 12:27:04 crc kubenswrapper[4892]: I1006 12:27:04.443952 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:27:04 crc kubenswrapper[4892]: I1006 12:27:04.557425 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:27:04 crc kubenswrapper[4892]: I1006 12:27:04.591473 4892 generic.go:334] "Generic (PLEG): container finished" podID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerID="aa831f1d1cc18eaad85f24e4ffb6f6cda2fba4aad778ae21f357a85050e812fd" exitCode=0 Oct 06 12:27:04 crc kubenswrapper[4892]: I1006 12:27:04.592342 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" event={"ID":"d6990a45-0f43-40a0-9c44-54e026d3acd2","Type":"ContainerDied","Data":"aa831f1d1cc18eaad85f24e4ffb6f6cda2fba4aad778ae21f357a85050e812fd"} Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.018241 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.066303 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.095965 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-586dc66ccd-lmkkz" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.112604 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-svc\") pod \"d6990a45-0f43-40a0-9c44-54e026d3acd2\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.112666 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-swift-storage-0\") pod \"d6990a45-0f43-40a0-9c44-54e026d3acd2\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.112731 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-sb\") pod \"d6990a45-0f43-40a0-9c44-54e026d3acd2\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.112778 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-nb\") pod \"d6990a45-0f43-40a0-9c44-54e026d3acd2\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.112819 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-config\") pod \"d6990a45-0f43-40a0-9c44-54e026d3acd2\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.112839 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8dn\" (UniqueName: \"kubernetes.io/projected/d6990a45-0f43-40a0-9c44-54e026d3acd2-kube-api-access-fs8dn\") pod \"d6990a45-0f43-40a0-9c44-54e026d3acd2\" (UID: \"d6990a45-0f43-40a0-9c44-54e026d3acd2\") " Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.138857 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6990a45-0f43-40a0-9c44-54e026d3acd2-kube-api-access-fs8dn" (OuterVolumeSpecName: "kube-api-access-fs8dn") pod "d6990a45-0f43-40a0-9c44-54e026d3acd2" (UID: "d6990a45-0f43-40a0-9c44-54e026d3acd2"). InnerVolumeSpecName "kube-api-access-fs8dn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.214937 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs8dn\" (UniqueName: \"kubernetes.io/projected/d6990a45-0f43-40a0-9c44-54e026d3acd2-kube-api-access-fs8dn\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.216875 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6990a45-0f43-40a0-9c44-54e026d3acd2" (UID: "d6990a45-0f43-40a0-9c44-54e026d3acd2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.220989 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-config" (OuterVolumeSpecName: "config") pod "d6990a45-0f43-40a0-9c44-54e026d3acd2" (UID: "d6990a45-0f43-40a0-9c44-54e026d3acd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.233914 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d6990a45-0f43-40a0-9c44-54e026d3acd2" (UID: "d6990a45-0f43-40a0-9c44-54e026d3acd2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.234227 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d6990a45-0f43-40a0-9c44-54e026d3acd2" (UID: "d6990a45-0f43-40a0-9c44-54e026d3acd2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.240150 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d6990a45-0f43-40a0-9c44-54e026d3acd2" (UID: "d6990a45-0f43-40a0-9c44-54e026d3acd2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.319943 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.319975 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.319987 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.319997 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.320005 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6990a45-0f43-40a0-9c44-54e026d3acd2-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.515277 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.616108 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" event={"ID":"d6990a45-0f43-40a0-9c44-54e026d3acd2","Type":"ContainerDied","Data":"c6f11bde89413407e738e9ddd15dba55911783aa6da7199f5e53ad030f18d2e6"} Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.616461 4892 scope.go:117] "RemoveContainer" containerID="aa831f1d1cc18eaad85f24e4ffb6f6cda2fba4aad778ae21f357a85050e812fd" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.616586 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c68f45bbf-65dm9" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.620961 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-central-agent" containerID="cri-o://e6539d1e7544ee29dc9c8e9e92bb8c80e741283532258168d0521522155d4d52" gracePeriod=30 Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.621030 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerStarted","Data":"f104535ba92db431ddc5ae339990218e53641b7b04d8b3aa3f16e3a78a526eeb"} Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.621063 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.621345 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="proxy-httpd" containerID="cri-o://f104535ba92db431ddc5ae339990218e53641b7b04d8b3aa3f16e3a78a526eeb" gracePeriod=30 Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.621403 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="sg-core" containerID="cri-o://67eb75004ff0daa385a53bd94c782def7df97e07fa88afcdc0b6ded5ebf7627a" gracePeriod=30 Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.621443 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-notification-agent" containerID="cri-o://2e0bc68cd17eb8cdf12f45ea8fa27aaae9d3ddd95d93952eb9f3534f362600b0" gracePeriod=30 Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.654227 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-54cccc9b7d-psd8k" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.655238 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.206744924 podStartE2EDuration="59.655227079s" podCreationTimestamp="2025-10-06 12:26:07 +0000 UTC" firstStartedPulling="2025-10-06 12:26:10.22343637 +0000 UTC m=+1056.773142135" lastFinishedPulling="2025-10-06 12:27:05.671918525 +0000 UTC m=+1112.221624290" observedRunningTime="2025-10-06 12:27:06.655041794 +0000 UTC m=+1113.204747559" watchObservedRunningTime="2025-10-06 12:27:06.655227079 +0000 UTC m=+1113.204932844" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.678380 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c68f45bbf-65dm9"] Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.696656 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c68f45bbf-65dm9"] Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.751434 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-657b8d7546-4g6v6"] Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.751628 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-657b8d7546-4g6v6" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api-log" 
containerID="cri-o://49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2" gracePeriod=30 Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.751738 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-657b8d7546-4g6v6" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api" containerID="cri-o://1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e" gracePeriod=30 Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.757905 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-657b8d7546-4g6v6" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": EOF" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.758020 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-657b8d7546-4g6v6" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": EOF" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.775122 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7489d9984-82d5x" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Oct 06 12:27:06 crc kubenswrapper[4892]: I1006 12:27:06.951587 4892 scope.go:117] "RemoveContainer" containerID="c9d275bdc47003ae34b837f14357b7041049010369b6b360a1e4daa397ed3f91" Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.633877 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rk9bz" event={"ID":"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea","Type":"ContainerStarted","Data":"94ffbba9af6b2f85ea4ab925ef340ab9848563be2d1fd8d14327157208878515"} Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.638723 4892 generic.go:334] "Generic (PLEG): container finished" podID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerID="49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2" exitCode=143 Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.638810 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657b8d7546-4g6v6" event={"ID":"0e60546e-29f5-4c31-9124-1ee0f28340e8","Type":"ContainerDied","Data":"49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2"} Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.654804 4892 generic.go:334] "Generic (PLEG): container finished" podID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerID="f104535ba92db431ddc5ae339990218e53641b7b04d8b3aa3f16e3a78a526eeb" exitCode=0 Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.654842 4892 generic.go:334] "Generic (PLEG): container finished" podID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerID="67eb75004ff0daa385a53bd94c782def7df97e07fa88afcdc0b6ded5ebf7627a" exitCode=2 Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.654851 4892 generic.go:334] "Generic (PLEG): container finished" podID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerID="e6539d1e7544ee29dc9c8e9e92bb8c80e741283532258168d0521522155d4d52" exitCode=0 Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.654876 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerDied","Data":"f104535ba92db431ddc5ae339990218e53641b7b04d8b3aa3f16e3a78a526eeb"} Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.654906 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerDied","Data":"67eb75004ff0daa385a53bd94c782def7df97e07fa88afcdc0b6ded5ebf7627a"} Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.654921 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerDied","Data":"e6539d1e7544ee29dc9c8e9e92bb8c80e741283532258168d0521522155d4d52"} Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.677428 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rk9bz" podStartSLOduration=13.4293555 podStartE2EDuration="56.677409634s" podCreationTimestamp="2025-10-06 12:26:11 +0000 UTC" firstStartedPulling="2025-10-06 12:26:22.98009271 +0000 UTC m=+1069.529798475" lastFinishedPulling="2025-10-06 12:27:06.228146844 +0000 UTC m=+1112.777852609" observedRunningTime="2025-10-06 12:27:07.67074845 +0000 UTC m=+1114.220454215" watchObservedRunningTime="2025-10-06 12:27:07.677409634 +0000 UTC m=+1114.227115399" Oct 06 12:27:07 crc kubenswrapper[4892]: I1006 12:27:07.878281 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5bd5b56c96-qhdwx" Oct 06 12:27:08 crc kubenswrapper[4892]: I1006 12:27:08.057464 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:27:08 crc kubenswrapper[4892]: I1006 12:27:08.057866 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:27:08 crc kubenswrapper[4892]: I1006 12:27:08.057933 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 12:27:08 crc kubenswrapper[4892]: I1006 12:27:08.058173 4892 scope.go:117] "RemoveContainer" containerID="c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab" Oct 06 12:27:08 crc kubenswrapper[4892]: E1006 12:27:08.058430 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6002d110-e634-47ab-b33b-652cbf7b3466)\"" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" Oct 06 12:27:08 crc kubenswrapper[4892]: I1006 12:27:08.179922 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6990a45-0f43-40a0-9c44-54e026d3acd2" path="/var/lib/kubelet/pods/d6990a45-0f43-40a0-9c44-54e026d3acd2/volumes" Oct 06 12:27:08 crc kubenswrapper[4892]: I1006 12:27:08.667251 4892 scope.go:117] "RemoveContainer" containerID="c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab" Oct 06 12:27:08 crc kubenswrapper[4892]: E1006 12:27:08.667883 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(6002d110-e634-47ab-b33b-652cbf7b3466)\"" pod="openstack/watcher-decision-engine-0" 
podUID="6002d110-e634-47ab-b33b-652cbf7b3466" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.620978 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 12:27:09 crc kubenswrapper[4892]: E1006 12:27:09.621376 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerName="init" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.621391 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerName="init" Oct 06 12:27:09 crc kubenswrapper[4892]: E1006 12:27:09.621433 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93078cc-28b0-4b5f-a4f9-af91631acd25" containerName="init" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.621440 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93078cc-28b0-4b5f-a4f9-af91631acd25" containerName="init" Oct 06 12:27:09 crc kubenswrapper[4892]: E1006 12:27:09.621455 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerName="dnsmasq-dns" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.621461 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerName="dnsmasq-dns" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.621631 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93078cc-28b0-4b5f-a4f9-af91631acd25" containerName="init" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.621648 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6990a45-0f43-40a0-9c44-54e026d3acd2" containerName="dnsmasq-dns" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.622278 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.627656 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.627824 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-kcqlm" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.627913 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.632842 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.688567 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.688908 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.689035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs879\" (UniqueName: \"kubernetes.io/projected/5d87697a-3512-4f25-8a43-e3267c566051-kube-api-access-gs879\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.689066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.709509 4892 generic.go:334] "Generic (PLEG): container finished" podID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerID="2e0bc68cd17eb8cdf12f45ea8fa27aaae9d3ddd95d93952eb9f3534f362600b0" exitCode=0 Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.709567 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerDied","Data":"2e0bc68cd17eb8cdf12f45ea8fa27aaae9d3ddd95d93952eb9f3534f362600b0"} Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.790634 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.790677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.790943 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs879\" (UniqueName: \"kubernetes.io/projected/5d87697a-3512-4f25-8a43-e3267c566051-kube-api-access-gs879\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.790973 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.791704 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.796098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config-secret\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.796917 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.807705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs879\" (UniqueName: \"kubernetes.io/projected/5d87697a-3512-4f25-8a43-e3267c566051-kube-api-access-gs879\") pod \"openstackclient\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.874964 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-657b8d7546-4g6v6" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:51552->10.217.0.179:9311: read: connection reset by peer" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.878562 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-657b8d7546-4g6v6" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.179:9311/healthcheck\": read tcp 10.217.0.2:51564->10.217.0.179:9311: read: connection reset by peer" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.890107 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.948899 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.949656 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.958163 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.995354 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-log-httpd\") pod \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.995463 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-sg-core-conf-yaml\") pod \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.995500 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-config-data\") pod \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.995526 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-combined-ca-bundle\") pod \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.995562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-run-httpd\") pod \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.995621 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh2hz\" (UniqueName: \"kubernetes.io/projected/f75c5afb-b292-41c0-9e77-b9dc84f38b45-kube-api-access-kh2hz\") pod \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.995677 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-scripts\") pod \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\" (UID: \"f75c5afb-b292-41c0-9e77-b9dc84f38b45\") " Oct 06 12:27:09 crc kubenswrapper[4892]: I1006 12:27:09.996418 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f75c5afb-b292-41c0-9e77-b9dc84f38b45" (UID: "f75c5afb-b292-41c0-9e77-b9dc84f38b45"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.000362 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.000721 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="sg-core" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.000738 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="sg-core" Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.000753 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-central-agent" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.000760 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-central-agent" Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.000769 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="proxy-httpd" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.000775 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="proxy-httpd" Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.000791 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-notification-agent" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.000798 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-notification-agent" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.000978 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="proxy-httpd" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.000999 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="sg-core" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.001019 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-central-agent" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.001030 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" containerName="ceilometer-notification-agent" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.001779 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.004796 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f75c5afb-b292-41c0-9e77-b9dc84f38b45" (UID: "f75c5afb-b292-41c0-9e77-b9dc84f38b45"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.008379 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f75c5afb-b292-41c0-9e77-b9dc84f38b45-kube-api-access-kh2hz" (OuterVolumeSpecName: "kube-api-access-kh2hz") pod "f75c5afb-b292-41c0-9e77-b9dc84f38b45" (UID: "f75c5afb-b292-41c0-9e77-b9dc84f38b45"). InnerVolumeSpecName "kube-api-access-kh2hz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.010955 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-scripts" (OuterVolumeSpecName: "scripts") pod "f75c5afb-b292-41c0-9e77-b9dc84f38b45" (UID: "f75c5afb-b292-41c0-9e77-b9dc84f38b45"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.036369 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.090710 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f75c5afb-b292-41c0-9e77-b9dc84f38b45" (UID: "f75c5afb-b292-41c0-9e77-b9dc84f38b45"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.099448 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88899d7e-63bd-4092-a2e5-81974383d714-openstack-config-secret\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.099523 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88899d7e-63bd-4092-a2e5-81974383d714-openstack-config\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.099571 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88899d7e-63bd-4092-a2e5-81974383d714-combined-ca-bundle\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.099598 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxsj4\" (UniqueName: \"kubernetes.io/projected/88899d7e-63bd-4092-a2e5-81974383d714-kube-api-access-wxsj4\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.099957 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.099979 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.099993 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh2hz\" (UniqueName: \"kubernetes.io/projected/f75c5afb-b292-41c0-9e77-b9dc84f38b45-kube-api-access-kh2hz\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.100008 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.100020 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f75c5afb-b292-41c0-9e77-b9dc84f38b45-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.116516 4892 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 06 12:27:10 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_5d87697a-3512-4f25-8a43-e3267c566051_0(5f6bc9c54ce17d503f28d8e70242d9e362e7504529b44847c59623ee50af469f): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5f6bc9c54ce17d503f28d8e70242d9e362e7504529b44847c59623ee50af469f" Netns:"/var/run/netns/343fd3cc-1ec7-4ee1-835f-e39d78eac030" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5f6bc9c54ce17d503f28d8e70242d9e362e7504529b44847c59623ee50af469f;K8S_POD_UID=5d87697a-3512-4f25-8a43-e3267c566051" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/5d87697a-3512-4f25-8a43-e3267c566051]: expected pod UID "5d87697a-3512-4f25-8a43-e3267c566051" but got "88899d7e-63bd-4092-a2e5-81974383d714" from Kube API Oct 06 12:27:10 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 06 12:27:10 crc kubenswrapper[4892]: > Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.116572 4892 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Oct 06 12:27:10 crc kubenswrapper[4892]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_5d87697a-3512-4f25-8a43-e3267c566051_0(5f6bc9c54ce17d503f28d8e70242d9e362e7504529b44847c59623ee50af469f): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5f6bc9c54ce17d503f28d8e70242d9e362e7504529b44847c59623ee50af469f" Netns:"/var/run/netns/343fd3cc-1ec7-4ee1-835f-e39d78eac030" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5f6bc9c54ce17d503f28d8e70242d9e362e7504529b44847c59623ee50af469f;K8S_POD_UID=5d87697a-3512-4f25-8a43-e3267c566051" Path:"" ERRORED: 
error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/5d87697a-3512-4f25-8a43-e3267c566051]: expected pod UID "5d87697a-3512-4f25-8a43-e3267c566051" but got "88899d7e-63bd-4092-a2e5-81974383d714" from Kube API Oct 06 12:27:10 crc kubenswrapper[4892]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 06 12:27:10 crc kubenswrapper[4892]: > pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.136516 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f75c5afb-b292-41c0-9e77-b9dc84f38b45" (UID: "f75c5afb-b292-41c0-9e77-b9dc84f38b45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.160305 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-config-data" (OuterVolumeSpecName: "config-data") pod "f75c5afb-b292-41c0-9e77-b9dc84f38b45" (UID: "f75c5afb-b292-41c0-9e77-b9dc84f38b45"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.202503 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88899d7e-63bd-4092-a2e5-81974383d714-openstack-config-secret\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.202544 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88899d7e-63bd-4092-a2e5-81974383d714-openstack-config\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.205109 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88899d7e-63bd-4092-a2e5-81974383d714-combined-ca-bundle\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.205130 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxsj4\" (UniqueName: \"kubernetes.io/projected/88899d7e-63bd-4092-a2e5-81974383d714-kube-api-access-wxsj4\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.205058 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/88899d7e-63bd-4092-a2e5-81974383d714-openstack-config\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.205531 4892 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.205550 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75c5afb-b292-41c0-9e77-b9dc84f38b45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.206790 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/88899d7e-63bd-4092-a2e5-81974383d714-openstack-config-secret\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.209059 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88899d7e-63bd-4092-a2e5-81974383d714-combined-ca-bundle\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.226349 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxsj4\" (UniqueName: \"kubernetes.io/projected/88899d7e-63bd-4092-a2e5-81974383d714-kube-api-access-wxsj4\") pod \"openstackclient\" (UID: \"88899d7e-63bd-4092-a2e5-81974383d714\") " pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.341255 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.434920 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.435789 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b4k8\" (UniqueName: \"kubernetes.io/projected/0e60546e-29f5-4c31-9124-1ee0f28340e8-kube-api-access-9b4k8\") pod \"0e60546e-29f5-4c31-9124-1ee0f28340e8\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.435854 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data-custom\") pod \"0e60546e-29f5-4c31-9124-1ee0f28340e8\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.436039 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e60546e-29f5-4c31-9124-1ee0f28340e8-logs\") pod \"0e60546e-29f5-4c31-9124-1ee0f28340e8\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.436094 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-combined-ca-bundle\") pod \"0e60546e-29f5-4c31-9124-1ee0f28340e8\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.436126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data\") pod \"0e60546e-29f5-4c31-9124-1ee0f28340e8\" (UID: \"0e60546e-29f5-4c31-9124-1ee0f28340e8\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.442002 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0e60546e-29f5-4c31-9124-1ee0f28340e8" (UID: "0e60546e-29f5-4c31-9124-1ee0f28340e8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.443301 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e60546e-29f5-4c31-9124-1ee0f28340e8-logs" (OuterVolumeSpecName: "logs") pod "0e60546e-29f5-4c31-9124-1ee0f28340e8" (UID: "0e60546e-29f5-4c31-9124-1ee0f28340e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.446302 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e60546e-29f5-4c31-9124-1ee0f28340e8-kube-api-access-9b4k8" (OuterVolumeSpecName: "kube-api-access-9b4k8") pod "0e60546e-29f5-4c31-9124-1ee0f28340e8" (UID: "0e60546e-29f5-4c31-9124-1ee0f28340e8"). InnerVolumeSpecName "kube-api-access-9b4k8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.502106 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e60546e-29f5-4c31-9124-1ee0f28340e8" (UID: "0e60546e-29f5-4c31-9124-1ee0f28340e8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.538551 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e60546e-29f5-4c31-9124-1ee0f28340e8-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.538579 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.538589 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b4k8\" (UniqueName: \"kubernetes.io/projected/0e60546e-29f5-4c31-9124-1ee0f28340e8-kube-api-access-9b4k8\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.538601 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.540051 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data" (OuterVolumeSpecName: "config-data") pod "0e60546e-29f5-4c31-9124-1ee0f28340e8" (UID: "0e60546e-29f5-4c31-9124-1ee0f28340e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.642435 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e60546e-29f5-4c31-9124-1ee0f28340e8-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.720065 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f75c5afb-b292-41c0-9e77-b9dc84f38b45","Type":"ContainerDied","Data":"cc6f318f1acab0f2ec431777c072253e2a33aa2e5901305138462e824cee1460"} Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.720113 4892 scope.go:117] "RemoveContainer" containerID="f104535ba92db431ddc5ae339990218e53641b7b04d8b3aa3f16e3a78a526eeb" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.720243 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.723024 4892 generic.go:334] "Generic (PLEG): container finished" podID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerID="1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e" exitCode=0 Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.723095 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.723239 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657b8d7546-4g6v6" event={"ID":"0e60546e-29f5-4c31-9124-1ee0f28340e8","Type":"ContainerDied","Data":"1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e"} Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.723263 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-657b8d7546-4g6v6" event={"ID":"0e60546e-29f5-4c31-9124-1ee0f28340e8","Type":"ContainerDied","Data":"f4cc11332c8a54bb3076da3365934c84efc90102ea95e556ea3fa7cb3182bc83"} Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.723275 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-657b8d7546-4g6v6" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.743203 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.748554 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.753221 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5d87697a-3512-4f25-8a43-e3267c566051" podUID="88899d7e-63bd-4092-a2e5-81974383d714" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.773697 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.798969 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.799403 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api-log" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.799422 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api-log" Oct 06 12:27:10 crc kubenswrapper[4892]: E1006 12:27:10.799435 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.799441 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.799617 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.799636 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" containerName="barbican-api-log" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.801358 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.807780 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.808102 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.818540 4892 scope.go:117] "RemoveContainer" containerID="67eb75004ff0daa385a53bd94c782def7df97e07fa88afcdc0b6ded5ebf7627a" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.819019 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-657b8d7546-4g6v6"] Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.828240 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-657b8d7546-4g6v6"] Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.837148 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.842680 4892 scope.go:117] "RemoveContainer" containerID="2e0bc68cd17eb8cdf12f45ea8fa27aaae9d3ddd95d93952eb9f3534f362600b0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.845872 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config\") pod \"5d87697a-3512-4f25-8a43-e3267c566051\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.845959 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-combined-ca-bundle\") pod \"5d87697a-3512-4f25-8a43-e3267c566051\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.846126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config-secret\") pod \"5d87697a-3512-4f25-8a43-e3267c566051\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.846189 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs879\" (UniqueName: \"kubernetes.io/projected/5d87697a-3512-4f25-8a43-e3267c566051-kube-api-access-gs879\") pod \"5d87697a-3512-4f25-8a43-e3267c566051\" (UID: \"5d87697a-3512-4f25-8a43-e3267c566051\") " Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.846425 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5d87697a-3512-4f25-8a43-e3267c566051" (UID: "5d87697a-3512-4f25-8a43-e3267c566051"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.846732 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.851448 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5d87697a-3512-4f25-8a43-e3267c566051" (UID: "5d87697a-3512-4f25-8a43-e3267c566051"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.851787 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d87697a-3512-4f25-8a43-e3267c566051-kube-api-access-gs879" (OuterVolumeSpecName: "kube-api-access-gs879") pod "5d87697a-3512-4f25-8a43-e3267c566051" (UID: "5d87697a-3512-4f25-8a43-e3267c566051"). InnerVolumeSpecName "kube-api-access-gs879". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.854076 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d87697a-3512-4f25-8a43-e3267c566051" (UID: "5d87697a-3512-4f25-8a43-e3267c566051"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.860382 4892 scope.go:117] "RemoveContainer" containerID="e6539d1e7544ee29dc9c8e9e92bb8c80e741283532258168d0521522155d4d52" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.915583 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 06 12:27:10 crc kubenswrapper[4892]: W1006 12:27:10.921495 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88899d7e_63bd_4092_a2e5_81974383d714.slice/crio-ae904f9d5e01a9bc7c32ebf5e4a12c08b2fcb19676167445a4083c41dd0eee8c WatchSource:0}: Error finding container ae904f9d5e01a9bc7c32ebf5e4a12c08b2fcb19676167445a4083c41dd0eee8c: Status 404 returned error can't find the container with id ae904f9d5e01a9bc7c32ebf5e4a12c08b2fcb19676167445a4083c41dd0eee8c Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948021 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-scripts\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948090 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948432 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-log-httpd\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948494 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58lzq\" (UniqueName: \"kubernetes.io/projected/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-kube-api-access-58lzq\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948654 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-config-data\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948771 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-run-httpd\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948830 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948890 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948908 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs879\" (UniqueName: \"kubernetes.io/projected/5d87697a-3512-4f25-8a43-e3267c566051-kube-api-access-gs879\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.948917 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d87697a-3512-4f25-8a43-e3267c566051-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:10 crc kubenswrapper[4892]: I1006 12:27:10.986749 4892 scope.go:117] "RemoveContainer" containerID="1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.024606 4892 scope.go:117] "RemoveContainer" containerID="49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.045936 4892 scope.go:117] "RemoveContainer" containerID="1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e" Oct 06 12:27:11 crc kubenswrapper[4892]: E1006 12:27:11.048565 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e\": container with ID starting with 1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e not found: ID does not exist" containerID="1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e" Oct 06 12:27:11 crc 
kubenswrapper[4892]: I1006 12:27:11.048605 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e"} err="failed to get container status \"1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e\": rpc error: code = NotFound desc = could not find container \"1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e\": container with ID starting with 1488df85152194bac3fee669671d39e5f172ffad51e1451fee474454e5224f5e not found: ID does not exist" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.048633 4892 scope.go:117] "RemoveContainer" containerID="49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2" Oct 06 12:27:11 crc kubenswrapper[4892]: E1006 12:27:11.049056 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2\": container with ID starting with 49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2 not found: ID does not exist" containerID="49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.049121 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2"} err="failed to get container status \"49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2\": rpc error: code = NotFound desc = could not find container \"49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2\": container with ID starting with 49480cfbaa68be5fa21fd300977962f34c0b30124b8bd7f40d3d6141afc626a2 not found: ID does not exist" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.050013 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-log-httpd\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.050054 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58lzq\" (UniqueName: \"kubernetes.io/projected/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-kube-api-access-58lzq\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.050115 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-config-data\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.050172 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-run-httpd\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.050202 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " 
pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.050234 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-scripts\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.050287 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.051633 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-run-httpd\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.051673 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-log-httpd\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.054302 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.054908 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-scripts\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.057284 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-config-data\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.057639 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.071476 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58lzq\" (UniqueName: \"kubernetes.io/projected/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-kube-api-access-58lzq\") pod \"ceilometer-0\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.122077 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.622555 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:11 crc kubenswrapper[4892]: W1006 12:27:11.626232 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b756ba9_5c1a_44a1_8d2f_57fa3a9f0e97.slice/crio-44e8d700f5b9cb276058246c21191adf21d99f88ab72250f73bb0cfd370c8899 WatchSource:0}: Error finding container 44e8d700f5b9cb276058246c21191adf21d99f88ab72250f73bb0cfd370c8899: Status 404 returned error can't find the container with id 44e8d700f5b9cb276058246c21191adf21d99f88ab72250f73bb0cfd370c8899 Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.738891 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"88899d7e-63bd-4092-a2e5-81974383d714","Type":"ContainerStarted","Data":"ae904f9d5e01a9bc7c32ebf5e4a12c08b2fcb19676167445a4083c41dd0eee8c"} Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.755351 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerStarted","Data":"44e8d700f5b9cb276058246c21191adf21d99f88ab72250f73bb0cfd370c8899"} Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.764572 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 06 12:27:11 crc kubenswrapper[4892]: I1006 12:27:11.858098 4892 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="5d87697a-3512-4f25-8a43-e3267c566051" podUID="88899d7e-63bd-4092-a2e5-81974383d714" Oct 06 12:27:12 crc kubenswrapper[4892]: I1006 12:27:12.186419 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e60546e-29f5-4c31-9124-1ee0f28340e8" path="/var/lib/kubelet/pods/0e60546e-29f5-4c31-9124-1ee0f28340e8/volumes" Oct 06 12:27:12 crc kubenswrapper[4892]: I1006 12:27:12.187396 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d87697a-3512-4f25-8a43-e3267c566051" path="/var/lib/kubelet/pods/5d87697a-3512-4f25-8a43-e3267c566051/volumes" Oct 06 12:27:12 crc kubenswrapper[4892]: I1006 12:27:12.187771 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f75c5afb-b292-41c0-9e77-b9dc84f38b45" path="/var/lib/kubelet/pods/f75c5afb-b292-41c0-9e77-b9dc84f38b45/volumes" Oct 06 12:27:12 crc kubenswrapper[4892]: I1006 12:27:12.774333 4892 generic.go:334] "Generic (PLEG): container finished" podID="eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" containerID="94ffbba9af6b2f85ea4ab925ef340ab9848563be2d1fd8d14327157208878515" exitCode=0 Oct 06 12:27:12 crc kubenswrapper[4892]: I1006 12:27:12.774446 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rk9bz" event={"ID":"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea","Type":"ContainerDied","Data":"94ffbba9af6b2f85ea4ab925ef340ab9848563be2d1fd8d14327157208878515"} Oct 06 12:27:12 crc kubenswrapper[4892]: I1006 12:27:12.777894 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerStarted","Data":"7cd572b4c48e2d00e0bf40fa35179c82494a8dd0b0e406e69152b90ae4fedac2"} Oct 06 12:27:12 crc kubenswrapper[4892]: I1006 12:27:12.778051 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerStarted","Data":"1c98c68346c852c6b86bbbb1a4f325d28e904786eaea2e6661104c316db0a332"} Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.530145 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-57bbd8d677-4mwpb"] Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.531968 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.534370 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.534488 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.538809 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.545484 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-57bbd8d677-4mwpb"] Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.725191 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-config-data\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.726297 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-public-tls-certs\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.726359 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9b16ec0c-fdde-42a8-9a45-da67ecd56360-etc-swift\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.726397 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-internal-tls-certs\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.726437 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b16ec0c-fdde-42a8-9a45-da67ecd56360-log-httpd\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.726458 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tfz\" (UniqueName: \"kubernetes.io/projected/9b16ec0c-fdde-42a8-9a45-da67ecd56360-kube-api-access-f7tfz\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") 
" pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.726498 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-combined-ca-bundle\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.726560 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b16ec0c-fdde-42a8-9a45-da67ecd56360-run-httpd\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.803765 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerStarted","Data":"8c3ceec821577c456237a2daa39c43f10ad54a97c80401a08f6f523dacad3ada"} Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827480 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-config-data\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827527 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-public-tls-certs\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827546 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9b16ec0c-fdde-42a8-9a45-da67ecd56360-etc-swift\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827565 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-internal-tls-certs\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827591 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b16ec0c-fdde-42a8-9a45-da67ecd56360-log-httpd\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827606 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tfz\" (UniqueName: \"kubernetes.io/projected/9b16ec0c-fdde-42a8-9a45-da67ecd56360-kube-api-access-f7tfz\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827640 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-combined-ca-bundle\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.827677 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b16ec0c-fdde-42a8-9a45-da67ecd56360-run-httpd\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.828147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b16ec0c-fdde-42a8-9a45-da67ecd56360-run-httpd\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.828388 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b16ec0c-fdde-42a8-9a45-da67ecd56360-log-httpd\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.837394 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-internal-tls-certs\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.837421 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-public-tls-certs\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.838347 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/9b16ec0c-fdde-42a8-9a45-da67ecd56360-etc-swift\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.839121 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-config-data\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.841356 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b16ec0c-fdde-42a8-9a45-da67ecd56360-combined-ca-bundle\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:13 crc kubenswrapper[4892]: I1006 12:27:13.857117 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tfz\" (UniqueName: 
\"kubernetes.io/projected/9b16ec0c-fdde-42a8-9a45-da67ecd56360-kube-api-access-f7tfz\") pod \"swift-proxy-57bbd8d677-4mwpb\" (UID: \"9b16ec0c-fdde-42a8-9a45-da67ecd56360\") " pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.147035 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.186427 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.268210 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.447496 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-db-sync-config-data\") pod \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.447562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-config-data\") pod \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.447617 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-etc-machine-id\") pod \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.447678 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-scripts\") pod \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.447815 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-combined-ca-bundle\") pod \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.447863 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px9bw\" (UniqueName: \"kubernetes.io/projected/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-kube-api-access-px9bw\") pod \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\" (UID: \"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea\") " Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.448510 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" (UID: "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.449056 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.452333 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-kube-api-access-px9bw" (OuterVolumeSpecName: "kube-api-access-px9bw") pod "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" (UID: "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea"). InnerVolumeSpecName "kube-api-access-px9bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.453165 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-scripts" (OuterVolumeSpecName: "scripts") pod "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" (UID: "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.458777 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" (UID: "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.477242 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" (UID: "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.499198 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-config-data" (OuterVolumeSpecName: "config-data") pod "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" (UID: "eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.550436 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.550488 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px9bw\" (UniqueName: \"kubernetes.io/projected/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-kube-api-access-px9bw\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.550500 4892 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.550510 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.550519 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.726049 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-57bbd8d677-4mwpb"] Oct 06 12:27:14 crc kubenswrapper[4892]: W1006 12:27:14.728275 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b16ec0c_fdde_42a8_9a45_da67ecd56360.slice/crio-0ef6aee0255a447810d2ab9b3b052ce7b80385d251282db7842f9145a4cf29a3 WatchSource:0}: Error finding container 0ef6aee0255a447810d2ab9b3b052ce7b80385d251282db7842f9145a4cf29a3: Status 404 returned error can't find the container with id 0ef6aee0255a447810d2ab9b3b052ce7b80385d251282db7842f9145a4cf29a3 Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.818429 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57bbd8d677-4mwpb" event={"ID":"9b16ec0c-fdde-42a8-9a45-da67ecd56360","Type":"ContainerStarted","Data":"0ef6aee0255a447810d2ab9b3b052ce7b80385d251282db7842f9145a4cf29a3"} Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.823984 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rk9bz" event={"ID":"eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea","Type":"ContainerDied","Data":"05adee9262e8f58651ca92a0eb84ae20cd9b86888ab99cc52d05448d283d65ab"} Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.824020 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05adee9262e8f58651ca92a0eb84ae20cd9b86888ab99cc52d05448d283d65ab" Oct 06 12:27:14 crc kubenswrapper[4892]: I1006 12:27:14.824073 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rk9bz" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.063363 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 12:27:15 crc kubenswrapper[4892]: E1006 12:27:15.063732 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" containerName="cinder-db-sync" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.063751 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" containerName="cinder-db-sync" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.063941 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" containerName="cinder-db-sync" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.065036 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.071180 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.076710 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-ndb9h" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.076745 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.076985 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.115025 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.162632 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2xqj\" (UniqueName: \"kubernetes.io/projected/6950e2ad-e715-42cf-ae66-ac50bc683bf1-kube-api-access-p2xqj\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.162679 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.162705 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.162782 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-scripts\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.162861 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.162889 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6950e2ad-e715-42cf-ae66-ac50bc683bf1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.173078 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cddcc8d5f-5nwpd"] Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.174628 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.205476 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cddcc8d5f-5nwpd"] Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.237717 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.241895 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.247474 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.264681 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data-custom\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.266238 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-svc\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.266363 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-swift-storage-0\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.266486 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/455b9aa7-eeb4-40aa-ae93-872b577730d4-logs\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.266552 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 
12:27:15.266643 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.266738 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6950e2ad-e715-42cf-ae66-ac50bc683bf1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.266811 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-config\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.266923 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2xqj\" (UniqueName: \"kubernetes.io/projected/6950e2ad-e715-42cf-ae66-ac50bc683bf1-kube-api-access-p2xqj\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.267867 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6rlz\" (UniqueName: \"kubernetes.io/projected/2abc9761-3bef-408b-ab0e-2947fc29b250-kube-api-access-q6rlz\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.267938 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.268030 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.268613 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-nb\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.268776 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-scripts\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.268804 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/6950e2ad-e715-42cf-ae66-ac50bc683bf1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.268811 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-sb\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.268887 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.268914 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbl25\" (UniqueName: \"kubernetes.io/projected/455b9aa7-eeb4-40aa-ae93-872b577730d4-kube-api-access-cbl25\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.269129 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/455b9aa7-eeb4-40aa-ae93-872b577730d4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.269154 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-scripts\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.270208 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.273926 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.278811 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.279610 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-scripts\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.289097 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data\") 
pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.290951 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2xqj\" (UniqueName: \"kubernetes.io/projected/6950e2ad-e715-42cf-ae66-ac50bc683bf1-kube-api-access-p2xqj\") pod \"cinder-scheduler-0\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.370920 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data-custom\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.370960 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-svc\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.370983 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-swift-storage-0\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371031 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/455b9aa7-eeb4-40aa-ae93-872b577730d4-logs\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371121 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-config\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371161 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6rlz\" (UniqueName: \"kubernetes.io/projected/2abc9761-3bef-408b-ab0e-2947fc29b250-kube-api-access-q6rlz\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371207 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-nb\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371249 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-sb\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371273 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbl25\" (UniqueName: \"kubernetes.io/projected/455b9aa7-eeb4-40aa-ae93-872b577730d4-kube-api-access-cbl25\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371345 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/455b9aa7-eeb4-40aa-ae93-872b577730d4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.371360 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-scripts\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.372649 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-swift-storage-0\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.372850 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-config\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.373594 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-sb\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.374496 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/455b9aa7-eeb4-40aa-ae93-872b577730d4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.374717 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-nb\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " 
pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.375030 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/455b9aa7-eeb4-40aa-ae93-872b577730d4-logs\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.376374 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-svc\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.376726 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-scripts\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.381292 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data-custom\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.383406 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.384243 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.394820 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbl25\" (UniqueName: \"kubernetes.io/projected/455b9aa7-eeb4-40aa-ae93-872b577730d4-kube-api-access-cbl25\") pod \"cinder-api-0\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.398273 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6rlz\" (UniqueName: \"kubernetes.io/projected/2abc9761-3bef-408b-ab0e-2947fc29b250-kube-api-access-q6rlz\") pod \"dnsmasq-dns-cddcc8d5f-5nwpd\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.398820 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.622813 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.627429 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.860435 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerStarted","Data":"bac6a10f74e4f31f6a49a260d7929e8c700bd34dbb90a6e4c7e3b2c4b43bedc3"} Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.860596 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-central-agent" containerID="cri-o://1c98c68346c852c6b86bbbb1a4f325d28e904786eaea2e6661104c316db0a332" gracePeriod=30 Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.861104 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.861219 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="proxy-httpd" containerID="cri-o://bac6a10f74e4f31f6a49a260d7929e8c700bd34dbb90a6e4c7e3b2c4b43bedc3" gracePeriod=30 Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.861302 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-notification-agent" containerID="cri-o://7cd572b4c48e2d00e0bf40fa35179c82494a8dd0b0e406e69152b90ae4fedac2" gracePeriod=30 Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.861357 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="sg-core" containerID="cri-o://8c3ceec821577c456237a2daa39c43f10ad54a97c80401a08f6f523dacad3ada" gracePeriod=30 Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.866259 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57bbd8d677-4mwpb" event={"ID":"9b16ec0c-fdde-42a8-9a45-da67ecd56360","Type":"ContainerStarted","Data":"7418fcdb6ff9276f377bf99649f14b2aa2e902c3d8d6688b248d1e133f6062bc"} Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.866291 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-57bbd8d677-4mwpb" event={"ID":"9b16ec0c-fdde-42a8-9a45-da67ecd56360","Type":"ContainerStarted","Data":"8d8d5905705731aebb63706ee02f19e15610b7b05b2128e3eda992fa6b467b1d"} Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.866437 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.866457 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.893300 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.707948839 podStartE2EDuration="5.893283055s" podCreationTimestamp="2025-10-06 12:27:10 +0000 UTC" firstStartedPulling="2025-10-06 12:27:11.628802148 +0000 UTC m=+1118.178507913" lastFinishedPulling="2025-10-06 12:27:14.814136364 +0000 UTC m=+1121.363842129" observedRunningTime="2025-10-06 12:27:15.880376289 +0000 UTC m=+1122.430082054" watchObservedRunningTime="2025-10-06 12:27:15.893283055 +0000 UTC m=+1122.442988820" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 
12:27:15.926771 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-57bbd8d677-4mwpb" podStartSLOduration=2.9267520769999997 podStartE2EDuration="2.926752077s" podCreationTimestamp="2025-10-06 12:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:15.914543543 +0000 UTC m=+1122.464249308" watchObservedRunningTime="2025-10-06 12:27:15.926752077 +0000 UTC m=+1122.476457843" Oct 06 12:27:15 crc kubenswrapper[4892]: I1006 12:27:15.957764 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.269180 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cddcc8d5f-5nwpd"] Oct 06 12:27:16 crc kubenswrapper[4892]: W1006 12:27:16.312516 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2abc9761_3bef_408b_ab0e_2947fc29b250.slice/crio-044cc95100493c8ba2c49c7bdb2c77d91af132eeffbdc30c716379b6b0f12006 WatchSource:0}: Error finding container 044cc95100493c8ba2c49c7bdb2c77d91af132eeffbdc30c716379b6b0f12006: Status 404 returned error can't find the container with id 044cc95100493c8ba2c49c7bdb2c77d91af132eeffbdc30c716379b6b0f12006 Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.392615 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.774506 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7489d9984-82d5x" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.167:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.167:8443: connect: connection refused" Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.774598 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.909949 4892 generic.go:334] "Generic (PLEG): container finished" podID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerID="7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c" exitCode=0 Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.910261 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" event={"ID":"2abc9761-3bef-408b-ab0e-2947fc29b250","Type":"ContainerDied","Data":"7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c"} Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.910289 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" event={"ID":"2abc9761-3bef-408b-ab0e-2947fc29b250","Type":"ContainerStarted","Data":"044cc95100493c8ba2c49c7bdb2c77d91af132eeffbdc30c716379b6b0f12006"} Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.936706 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6950e2ad-e715-42cf-ae66-ac50bc683bf1","Type":"ContainerStarted","Data":"daad2bb6da6e0494a1a0e813b1a3a15341ee3e95eedaf56a9439ba5a13abdb30"} Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.986759 4892 generic.go:334] "Generic (PLEG): container finished" podID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerID="bac6a10f74e4f31f6a49a260d7929e8c700bd34dbb90a6e4c7e3b2c4b43bedc3" exitCode=0 Oct 06 
12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.986792 4892 generic.go:334] "Generic (PLEG): container finished" podID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerID="8c3ceec821577c456237a2daa39c43f10ad54a97c80401a08f6f523dacad3ada" exitCode=2 Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.986801 4892 generic.go:334] "Generic (PLEG): container finished" podID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerID="7cd572b4c48e2d00e0bf40fa35179c82494a8dd0b0e406e69152b90ae4fedac2" exitCode=0 Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.986873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerDied","Data":"bac6a10f74e4f31f6a49a260d7929e8c700bd34dbb90a6e4c7e3b2c4b43bedc3"} Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.986901 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerDied","Data":"8c3ceec821577c456237a2daa39c43f10ad54a97c80401a08f6f523dacad3ada"} Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.986911 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerDied","Data":"7cd572b4c48e2d00e0bf40fa35179c82494a8dd0b0e406e69152b90ae4fedac2"} Oct 06 12:27:16 crc kubenswrapper[4892]: I1006 12:27:16.999640 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"455b9aa7-eeb4-40aa-ae93-872b577730d4","Type":"ContainerStarted","Data":"3fafb9db75ef842495305bcb5b017578f284048fd9effb8db5394d2e5b4c19bd"} Oct 06 12:27:17 crc kubenswrapper[4892]: I1006 12:27:17.619063 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:18 crc kubenswrapper[4892]: I1006 12:27:18.012225 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" event={"ID":"2abc9761-3bef-408b-ab0e-2947fc29b250","Type":"ContainerStarted","Data":"ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b"} Oct 06 12:27:18 crc kubenswrapper[4892]: I1006 12:27:18.012530 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:18 crc kubenswrapper[4892]: I1006 12:27:18.014805 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6950e2ad-e715-42cf-ae66-ac50bc683bf1","Type":"ContainerStarted","Data":"9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8"} Oct 06 12:27:18 crc kubenswrapper[4892]: I1006 12:27:18.018209 4892 generic.go:334] "Generic (PLEG): container finished" podID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerID="1c98c68346c852c6b86bbbb1a4f325d28e904786eaea2e6661104c316db0a332" exitCode=0 Oct 06 12:27:18 crc kubenswrapper[4892]: I1006 12:27:18.018255 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerDied","Data":"1c98c68346c852c6b86bbbb1a4f325d28e904786eaea2e6661104c316db0a332"} Oct 06 12:27:18 crc kubenswrapper[4892]: I1006 12:27:18.019861 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"455b9aa7-eeb4-40aa-ae93-872b577730d4","Type":"ContainerStarted","Data":"1142e5a9f6af62beb67e814f2f19a17657ea6b831ae2cadca200c1e14e6d7943"} Oct 06 12:27:18 crc kubenswrapper[4892]: I1006 12:27:18.034290 4892 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" podStartSLOduration=3.034273922 podStartE2EDuration="3.034273922s" podCreationTimestamp="2025-10-06 12:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:18.030333027 +0000 UTC m=+1124.580038812" watchObservedRunningTime="2025-10-06 12:27:18.034273922 +0000 UTC m=+1124.583979687" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.185068 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-g72j5"] Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.186588 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-g72j5" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.197500 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-g72j5"] Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.276216 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vvk9\" (UniqueName: \"kubernetes.io/projected/be99d63c-4bdd-4dae-a003-5215816244ac-kube-api-access-9vvk9\") pod \"nova-api-db-create-g72j5\" (UID: \"be99d63c-4bdd-4dae-a003-5215816244ac\") " pod="openstack/nova-api-db-create-g72j5" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.283148 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-q7svw"] Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.284554 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-q7svw" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.290172 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-q7svw"] Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.378440 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vvk9\" (UniqueName: \"kubernetes.io/projected/be99d63c-4bdd-4dae-a003-5215816244ac-kube-api-access-9vvk9\") pod \"nova-api-db-create-g72j5\" (UID: \"be99d63c-4bdd-4dae-a003-5215816244ac\") " pod="openstack/nova-api-db-create-g72j5" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.378717 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rkz2\" (UniqueName: \"kubernetes.io/projected/367eb665-c929-4ea9-8fb5-cd23cd430278-kube-api-access-9rkz2\") pod \"nova-cell0-db-create-q7svw\" (UID: \"367eb665-c929-4ea9-8fb5-cd23cd430278\") " pod="openstack/nova-cell0-db-create-q7svw" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.393116 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hbr5f"] Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.394515 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hbr5f" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.399297 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vvk9\" (UniqueName: \"kubernetes.io/projected/be99d63c-4bdd-4dae-a003-5215816244ac-kube-api-access-9vvk9\") pod \"nova-api-db-create-g72j5\" (UID: \"be99d63c-4bdd-4dae-a003-5215816244ac\") " pod="openstack/nova-api-db-create-g72j5" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.399630 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hbr5f"] Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.480486 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9ld7\" (UniqueName: \"kubernetes.io/projected/b63c64d6-eb6d-4370-abd8-b4eb8487ae8a-kube-api-access-q9ld7\") pod \"nova-cell1-db-create-hbr5f\" (UID: \"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a\") " pod="openstack/nova-cell1-db-create-hbr5f" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.480695 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rkz2\" (UniqueName: \"kubernetes.io/projected/367eb665-c929-4ea9-8fb5-cd23cd430278-kube-api-access-9rkz2\") pod \"nova-cell0-db-create-q7svw\" (UID: \"367eb665-c929-4ea9-8fb5-cd23cd430278\") " pod="openstack/nova-cell0-db-create-q7svw" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.495042 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rkz2\" (UniqueName: \"kubernetes.io/projected/367eb665-c929-4ea9-8fb5-cd23cd430278-kube-api-access-9rkz2\") pod \"nova-cell0-db-create-q7svw\" (UID: \"367eb665-c929-4ea9-8fb5-cd23cd430278\") " pod="openstack/nova-cell0-db-create-q7svw" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.515516 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-g72j5" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.581935 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9ld7\" (UniqueName: \"kubernetes.io/projected/b63c64d6-eb6d-4370-abd8-b4eb8487ae8a-kube-api-access-q9ld7\") pod \"nova-cell1-db-create-hbr5f\" (UID: \"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a\") " pod="openstack/nova-cell1-db-create-hbr5f" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.599198 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9ld7\" (UniqueName: \"kubernetes.io/projected/b63c64d6-eb6d-4370-abd8-b4eb8487ae8a-kube-api-access-q9ld7\") pod \"nova-cell1-db-create-hbr5f\" (UID: \"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a\") " pod="openstack/nova-cell1-db-create-hbr5f" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.654970 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-q7svw" Oct 06 12:27:20 crc kubenswrapper[4892]: I1006 12:27:20.763018 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hbr5f" Oct 06 12:27:21 crc kubenswrapper[4892]: I1006 12:27:21.091480 4892 generic.go:334] "Generic (PLEG): container finished" podID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerID="b929f09ed20951199fcf75f8cafe3036227c5cbc150094f8ea1bac9f3e6ae072" exitCode=137 Oct 06 12:27:21 crc kubenswrapper[4892]: I1006 12:27:21.091713 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7489d9984-82d5x" event={"ID":"b229a3b8-5243-4ec5-8970-d69b61553a4b","Type":"ContainerDied","Data":"b929f09ed20951199fcf75f8cafe3036227c5cbc150094f8ea1bac9f3e6ae072"} Oct 06 12:27:21 crc kubenswrapper[4892]: I1006 12:27:21.168790 4892 scope.go:117] "RemoveContainer" containerID="c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab" Oct 06 12:27:21 crc kubenswrapper[4892]: I1006 12:27:21.759291 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:27:21 crc kubenswrapper[4892]: I1006 12:27:21.759697 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-httpd" containerID="cri-o://5d81adafc0eea76a3209fb496c1db4db2242eb936080e4c7e6cf5a46abf4fbc2" gracePeriod=30 Oct 06 12:27:21 crc kubenswrapper[4892]: I1006 12:27:21.759649 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-log" containerID="cri-o://b68b173207667e8ecfc2b1fda1ce77722b03d173575c781ce659aa479a985d2e" gracePeriod=30 Oct 06 12:27:22 crc kubenswrapper[4892]: I1006 12:27:22.107672 4892 generic.go:334] "Generic (PLEG): container finished" podID="ff0622c4-d0fd-4732-9418-58d60d081887" containerID="b68b173207667e8ecfc2b1fda1ce77722b03d173575c781ce659aa479a985d2e" exitCode=143 Oct 06 12:27:22 crc kubenswrapper[4892]: I1006 12:27:22.107730 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff0622c4-d0fd-4732-9418-58d60d081887","Type":"ContainerDied","Data":"b68b173207667e8ecfc2b1fda1ce77722b03d173575c781ce659aa479a985d2e"} Oct 06 12:27:22 crc kubenswrapper[4892]: E1006 12:27:22.693736 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff0622c4_d0fd_4732_9418_58d60d081887.slice/crio-conmon-5d81adafc0eea76a3209fb496c1db4db2242eb936080e4c7e6cf5a46abf4fbc2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff0622c4_d0fd_4732_9418_58d60d081887.slice/crio-5d81adafc0eea76a3209fb496c1db4db2242eb936080e4c7e6cf5a46abf4fbc2.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:27:23 crc kubenswrapper[4892]: I1006 12:27:23.131488 4892 generic.go:334] "Generic (PLEG): container finished" podID="ff0622c4-d0fd-4732-9418-58d60d081887" containerID="5d81adafc0eea76a3209fb496c1db4db2242eb936080e4c7e6cf5a46abf4fbc2" exitCode=0 Oct 06 12:27:23 crc kubenswrapper[4892]: I1006 12:27:23.131536 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff0622c4-d0fd-4732-9418-58d60d081887","Type":"ContainerDied","Data":"5d81adafc0eea76a3209fb496c1db4db2242eb936080e4c7e6cf5a46abf4fbc2"} Oct 06 12:27:23 crc kubenswrapper[4892]: I1006 
12:27:23.989722 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.060467 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-log-httpd\") pod \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.060538 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-scripts\") pod \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.060643 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-combined-ca-bundle\") pod \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.060668 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-run-httpd\") pod \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.060701 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58lzq\" (UniqueName: \"kubernetes.io/projected/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-kube-api-access-58lzq\") pod \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.060726 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-sg-core-conf-yaml\") pod \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.060751 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-config-data\") pod \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\" (UID: \"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.065876 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" (UID: "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.066509 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" (UID: "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.071160 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-scripts" (OuterVolumeSpecName: "scripts") pod "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" (UID: "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.084594 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-kube-api-access-58lzq" (OuterVolumeSpecName: "kube-api-access-58lzq") pod "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" (UID: "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97"). InnerVolumeSpecName "kube-api-access-58lzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.160418 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.162162 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58lzq\" (UniqueName: \"kubernetes.io/projected/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-kube-api-access-58lzq\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.162180 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.162188 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.162196 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.205111 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.269528 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.273971 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97","Type":"ContainerDied","Data":"44e8d700f5b9cb276058246c21191adf21d99f88ab72250f73bb0cfd370c8899"} Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.274053 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-57bbd8d677-4mwpb" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.274086 4892 scope.go:117] "RemoveContainer" containerID="bac6a10f74e4f31f6a49a260d7929e8c700bd34dbb90a6e4c7e3b2c4b43bedc3" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.292403 4892 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","poda61c1481-8c3c-490a-97eb-a03156bb7ee5"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort poda61c1481-8c3c-490a-97eb-a03156bb7ee5] : Timed out while waiting for systemd to remove kubepods-besteffort-poda61c1481_8c3c_490a_97eb_a03156bb7ee5.slice" Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.292462 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort poda61c1481-8c3c-490a-97eb-a03156bb7ee5] : unable to destroy cgroup paths for cgroup [kubepods besteffort poda61c1481-8c3c-490a-97eb-a03156bb7ee5] : Timed out while waiting for systemd to remove kubepods-besteffort-poda61c1481_8c3c_490a_97eb_a03156bb7ee5.slice" pod="openstack/watcher-api-0" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.314353 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" (UID: "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.323725 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.325036 4892 scope.go:117] "RemoveContainer" containerID="8c3ceec821577c456237a2daa39c43f10ad54a97c80401a08f6f523dacad3ada" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.357895 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.365791 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-config-data\") pod \"b229a3b8-5243-4ec5-8970-d69b61553a4b\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.365997 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wv5k\" (UniqueName: \"kubernetes.io/projected/b229a3b8-5243-4ec5-8970-d69b61553a4b-kube-api-access-5wv5k\") pod \"b229a3b8-5243-4ec5-8970-d69b61553a4b\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.366205 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-combined-ca-bundle\") pod \"b229a3b8-5243-4ec5-8970-d69b61553a4b\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.366425 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b229a3b8-5243-4ec5-8970-d69b61553a4b-logs\") pod \"b229a3b8-5243-4ec5-8970-d69b61553a4b\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.366584 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-tls-certs\") pod \"b229a3b8-5243-4ec5-8970-d69b61553a4b\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.366705 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-scripts\") pod \"b229a3b8-5243-4ec5-8970-d69b61553a4b\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.366919 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-secret-key\") pod \"b229a3b8-5243-4ec5-8970-d69b61553a4b\" (UID: \"b229a3b8-5243-4ec5-8970-d69b61553a4b\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.368946 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.371941 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b229a3b8-5243-4ec5-8970-d69b61553a4b-logs" (OuterVolumeSpecName: "logs") pod "b229a3b8-5243-4ec5-8970-d69b61553a4b" (UID: "b229a3b8-5243-4ec5-8970-d69b61553a4b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.374834 4892 scope.go:117] "RemoveContainer" containerID="7cd572b4c48e2d00e0bf40fa35179c82494a8dd0b0e406e69152b90ae4fedac2" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.383495 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b229a3b8-5243-4ec5-8970-d69b61553a4b-kube-api-access-5wv5k" (OuterVolumeSpecName: "kube-api-access-5wv5k") pod "b229a3b8-5243-4ec5-8970-d69b61553a4b" (UID: "b229a3b8-5243-4ec5-8970-d69b61553a4b"). InnerVolumeSpecName "kube-api-access-5wv5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.388722 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" (UID: "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.408506 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b229a3b8-5243-4ec5-8970-d69b61553a4b" (UID: "b229a3b8-5243-4ec5-8970-d69b61553a4b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.408825 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-scripts" (OuterVolumeSpecName: "scripts") pod "b229a3b8-5243-4ec5-8970-d69b61553a4b" (UID: "b229a3b8-5243-4ec5-8970-d69b61553a4b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.475349 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-httpd-run\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.475423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-combined-ca-bundle\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.475484 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-scripts\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.475596 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.475620 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-config-data\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.477574 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-internal-tls-certs\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.477607 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq9mw\" (UniqueName: \"kubernetes.io/projected/ff0622c4-d0fd-4732-9418-58d60d081887-kube-api-access-vq9mw\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.477645 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-logs\") pod \"ff0622c4-d0fd-4732-9418-58d60d081887\" (UID: \"ff0622c4-d0fd-4732-9418-58d60d081887\") " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.478034 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wv5k\" (UniqueName: \"kubernetes.io/projected/b229a3b8-5243-4ec5-8970-d69b61553a4b-kube-api-access-5wv5k\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.478046 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b229a3b8-5243-4ec5-8970-d69b61553a4b-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.478054 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.478062 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.478069 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.478432 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-logs" (OuterVolumeSpecName: "logs") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.478652 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.483477 4892 scope.go:117] "RemoveContainer" containerID="1c98c68346c852c6b86bbbb1a4f325d28e904786eaea2e6661104c316db0a332" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.483699 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0622c4-d0fd-4732-9418-58d60d081887-kube-api-access-vq9mw" (OuterVolumeSpecName: "kube-api-access-vq9mw") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "kube-api-access-vq9mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.483965 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-config-data" (OuterVolumeSpecName: "config-data") pod "b229a3b8-5243-4ec5-8970-d69b61553a4b" (UID: "b229a3b8-5243-4ec5-8970-d69b61553a4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.490418 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.491197 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-scripts" (OuterVolumeSpecName: "scripts") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.497730 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b229a3b8-5243-4ec5-8970-d69b61553a4b" (UID: "b229a3b8-5243-4ec5-8970-d69b61553a4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.506987 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-config-data" (OuterVolumeSpecName: "config-data") pod "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" (UID: "7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.569689 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579820 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579850 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq9mw\" (UniqueName: \"kubernetes.io/projected/ff0622c4-d0fd-4732-9418-58d60d081887-kube-api-access-vq9mw\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579861 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579869 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579878 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b229a3b8-5243-4ec5-8970-d69b61553a4b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579885 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff0622c4-d0fd-4732-9418-58d60d081887-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579892 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579900 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.579908 4892 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.581513 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b229a3b8-5243-4ec5-8970-d69b61553a4b" (UID: "b229a3b8-5243-4ec5-8970-d69b61553a4b"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.612924 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.621953 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.633988 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.648605 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.649231 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-notification-agent" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649261 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-notification-agent" Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.649289 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="sg-core" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649302 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="sg-core" Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.649379 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon-log" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649393 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon-log" Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.649429 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-central-agent" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649441 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-central-agent" Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.649475 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="proxy-httpd" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649487 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="proxy-httpd" Oct 06 12:27:24 crc 
kubenswrapper[4892]: E1006 12:27:24.649504 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-log" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649516 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-log" Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.649529 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-httpd" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649542 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-httpd" Oct 06 12:27:24 crc kubenswrapper[4892]: E1006 12:27:24.649571 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649585 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649894 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-central-agent" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649919 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="proxy-httpd" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649953 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="sg-core" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.649980 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-log" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.650006 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" containerName="glance-httpd" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.650026 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" containerName="ceilometer-notification-agent" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.650051 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon-log" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.650069 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" containerName="horizon" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.660242 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.660739 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.670782 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.671025 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.681751 4892 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b229a3b8-5243-4ec5-8970-d69b61553a4b-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.681777 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.682388 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.703901 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-config-data" (OuterVolumeSpecName: "config-data") pod "ff0622c4-d0fd-4732-9418-58d60d081887" (UID: "ff0622c4-d0fd-4732-9418-58d60d081887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.718199 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-g72j5"] Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.733386 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-q7svw"] Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.740252 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hbr5f"] Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783064 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-config-data\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783226 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783268 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bsx4\" (UniqueName: \"kubernetes.io/projected/4dfba438-0442-4bbc-a2ac-723505ee5984-kube-api-access-7bsx4\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783287 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-run-httpd\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783308 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-log-httpd\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783437 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-scripts\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783486 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.783496 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff0622c4-d0fd-4732-9418-58d60d081887-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.884447 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-scripts\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.884510 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-config-data\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.884571 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.884605 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bsx4\" (UniqueName: \"kubernetes.io/projected/4dfba438-0442-4bbc-a2ac-723505ee5984-kube-api-access-7bsx4\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.884626 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-run-httpd\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 
06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.884645 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-log-httpd\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.884659 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.886050 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-run-httpd\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.886147 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-log-httpd\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.892011 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.892317 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.893224 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-scripts\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.898078 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-config-data\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:24 crc kubenswrapper[4892]: I1006 12:27:24.901762 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bsx4\" (UniqueName: \"kubernetes.io/projected/4dfba438-0442-4bbc-a2ac-723505ee5984-kube-api-access-7bsx4\") pod \"ceilometer-0\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " pod="openstack/ceilometer-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.005754 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.231552 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"455b9aa7-eeb4-40aa-ae93-872b577730d4","Type":"ContainerStarted","Data":"17513f38162f888ac8fe00ed7de663f015912c8bf24aefa05de2f13ad9a92af4"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.231880 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.231886 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api-log" containerID="cri-o://1142e5a9f6af62beb67e814f2f19a17657ea6b831ae2cadca200c1e14e6d7943" gracePeriod=30 Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.232013 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api" containerID="cri-o://17513f38162f888ac8fe00ed7de663f015912c8bf24aefa05de2f13ad9a92af4" gracePeriod=30 Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.239049 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerStarted","Data":"4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.245963 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hbr5f" event={"ID":"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a","Type":"ContainerStarted","Data":"dc2ee1d288172ce961b1b45c6874ef8af9788f61038164e0770605e25c3aac82"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.246006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hbr5f" event={"ID":"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a","Type":"ContainerStarted","Data":"05030b9e3f08027121a7b4e90990a25f1ee54ab491939c96d1d2bc06f8de80b2"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.258063 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff0622c4-d0fd-4732-9418-58d60d081887","Type":"ContainerDied","Data":"0c21654092ab3ed0763f730b4898335e595386df4d748289b1cb52954f75d3d4"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.258113 4892 scope.go:117] "RemoveContainer" containerID="5d81adafc0eea76a3209fb496c1db4db2242eb936080e4c7e6cf5a46abf4fbc2" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.258218 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.269023 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=10.269001731 podStartE2EDuration="10.269001731s" podCreationTimestamp="2025-10-06 12:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:25.25382878 +0000 UTC m=+1131.803534545" watchObservedRunningTime="2025-10-06 12:27:25.269001731 +0000 UTC m=+1131.818707496" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.288670 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7489d9984-82d5x" event={"ID":"b229a3b8-5243-4ec5-8970-d69b61553a4b","Type":"ContainerDied","Data":"0744491d33296a0800e151123d41ec10745a8abeabfa6ccf6920e4684577d078"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.288775 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7489d9984-82d5x" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.305545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-g72j5" event={"ID":"be99d63c-4bdd-4dae-a003-5215816244ac","Type":"ContainerStarted","Data":"cf408c0fdd683bb3f656bf0cb6313060eac66780bb972b183b8321048737d8a4"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.305588 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-g72j5" event={"ID":"be99d63c-4bdd-4dae-a003-5215816244ac","Type":"ContainerStarted","Data":"0ae0826483a1dc09443e4dbef83d7a359d7c5dc6b4373062d41e21ff319eb758"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.308604 4892 scope.go:117] "RemoveContainer" containerID="b68b173207667e8ecfc2b1fda1ce77722b03d173575c781ce659aa479a985d2e" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.309690 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-hbr5f" podStartSLOduration=5.309666263 podStartE2EDuration="5.309666263s" podCreationTimestamp="2025-10-06 12:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:25.288539889 +0000 UTC m=+1131.838245674" watchObservedRunningTime="2025-10-06 12:27:25.309666263 +0000 UTC m=+1131.859372028" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.320357 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"88899d7e-63bd-4092-a2e5-81974383d714","Type":"ContainerStarted","Data":"526233d1c11da295e6903ea3d5f4a8f3ccb0f1f6d8ff9c0f0704e931e81a8ff7"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.326938 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-g72j5" podStartSLOduration=5.326913215 podStartE2EDuration="5.326913215s" podCreationTimestamp="2025-10-06 12:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:25.320679793 +0000 UTC m=+1131.870385558" watchObservedRunningTime="2025-10-06 12:27:25.326913215 +0000 UTC m=+1131.876618980" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.331883 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q7svw" 
event={"ID":"367eb665-c929-4ea9-8fb5-cd23cd430278","Type":"ContainerStarted","Data":"f59939b48aa37657cafd0b0c0f232a4eae9dbca6620699a77ce5538aa0471113"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.331933 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q7svw" event={"ID":"367eb665-c929-4ea9-8fb5-cd23cd430278","Type":"ContainerStarted","Data":"cc2b42523c04f1c0d9fdacd30f0208a7c6b8678a85125f46fd25a11eef09a6e3"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.358077 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6950e2ad-e715-42cf-ae66-ac50bc683bf1","Type":"ContainerStarted","Data":"05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849"} Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.361651 4892 scope.go:117] "RemoveContainer" containerID="18eac723022a92e25d1293baa3888a63d89e6f7b88835b26ef947985f501f48b" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.372000 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.394164 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.38705809 podStartE2EDuration="16.394141619s" podCreationTimestamp="2025-10-06 12:27:09 +0000 UTC" firstStartedPulling="2025-10-06 12:27:10.923809704 +0000 UTC m=+1117.473515469" lastFinishedPulling="2025-10-06 12:27:23.930893213 +0000 UTC m=+1130.480598998" observedRunningTime="2025-10-06 12:27:25.334504935 +0000 UTC m=+1131.884210700" watchObservedRunningTime="2025-10-06 12:27:25.394141619 +0000 UTC m=+1131.943847384" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.412185 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.433547 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.188:8080/\": dial tcp 10.217.0.188:8080: connect: connection refused" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.455870 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7489d9984-82d5x"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.473541 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7489d9984-82d5x"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.483090 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.495549 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.514301 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.516197 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.518835 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.523466 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.525129 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-q7svw" podStartSLOduration=5.525113706 podStartE2EDuration="5.525113706s" podCreationTimestamp="2025-10-06 12:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:25.408202208 +0000 UTC m=+1131.957907973" watchObservedRunningTime="2025-10-06 12:27:25.525113706 +0000 UTC m=+1132.074819471" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.542567 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.564634 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.565473 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.574077 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.575735 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.577394 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.577730 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.578137 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.583695 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=10.351117177999999 podStartE2EDuration="10.583678919s" podCreationTimestamp="2025-10-06 12:27:15 +0000 UTC" firstStartedPulling="2025-10-06 12:27:16.099193079 +0000 UTC m=+1122.648898844" lastFinishedPulling="2025-10-06 12:27:16.33175482 +0000 UTC m=+1122.881460585" observedRunningTime="2025-10-06 12:27:25.462912318 +0000 UTC m=+1132.012618073" watchObservedRunningTime="2025-10-06 12:27:25.583678919 +0000 UTC m=+1132.133384684" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.594396 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.603725 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.606842 4892 scope.go:117] "RemoveContainer" containerID="b929f09ed20951199fcf75f8cafe3036227c5cbc150094f8ea1bac9f3e6ae072" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.622807 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/972058aa-0e97-4041-96c6-41acbb31d3ce-logs\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.622862 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.622912 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.622934 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xctgq\" (UniqueName: \"kubernetes.io/projected/230bad23-f208-488d-90ae-dcdf6c56fa64-kube-api-access-xctgq\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.622953 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-config-data\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.622972 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/230bad23-f208-488d-90ae-dcdf6c56fa64-logs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.622993 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-public-tls-certs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623015 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dm2t\" (UniqueName: \"kubernetes.io/projected/972058aa-0e97-4041-96c6-41acbb31d3ce-kube-api-access-8dm2t\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623035 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623063 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623219 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/972058aa-0e97-4041-96c6-41acbb31d3ce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623267 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623344 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623371 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.623397 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.625425 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.696006 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78555bc94f-rgqkh"] Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.696249 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" podUID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerName="dnsmasq-dns" containerID="cri-o://8c04c36214f42adb11633d352a79a289d237e7b151efdb3764be56ff6b870e37" gracePeriod=10 Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.726980 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.727025 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-internal-tls-certs\") pod \"watcher-api-0\" (UID: 
\"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730190 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730257 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/972058aa-0e97-4041-96c6-41acbb31d3ce-logs\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730340 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730454 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730475 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xctgq\" (UniqueName: \"kubernetes.io/projected/230bad23-f208-488d-90ae-dcdf6c56fa64-kube-api-access-xctgq\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730497 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-config-data\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730527 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/230bad23-f208-488d-90ae-dcdf6c56fa64-logs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730561 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-public-tls-certs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730602 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dm2t\" (UniqueName: \"kubernetes.io/projected/972058aa-0e97-4041-96c6-41acbb31d3ce-kube-api-access-8dm2t\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730627 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730682 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730739 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/972058aa-0e97-4041-96c6-41acbb31d3ce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.730774 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.736672 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/230bad23-f208-488d-90ae-dcdf6c56fa64-logs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.738175 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/972058aa-0e97-4041-96c6-41acbb31d3ce-logs\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.739507 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.739947 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.754178 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.755050 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " 
pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.756670 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-public-tls-certs\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.758226 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-config-data\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.758894 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/972058aa-0e97-4041-96c6-41acbb31d3ce-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.759385 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-scripts\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.771222 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.774361 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230bad23-f208-488d-90ae-dcdf6c56fa64-config-data\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.792760 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xctgq\" (UniqueName: \"kubernetes.io/projected/230bad23-f208-488d-90ae-dcdf6c56fa64-kube-api-access-xctgq\") pod \"watcher-api-0\" (UID: \"230bad23-f208-488d-90ae-dcdf6c56fa64\") " pod="openstack/watcher-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.806991 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/972058aa-0e97-4041-96c6-41acbb31d3ce-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.825119 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.831177 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dm2t\" (UniqueName: 
\"kubernetes.io/projected/972058aa-0e97-4041-96c6-41acbb31d3ce-kube-api-access-8dm2t\") pod \"glance-default-internal-api-0\" (UID: \"972058aa-0e97-4041-96c6-41acbb31d3ce\") " pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.922753 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 06 12:27:25 crc kubenswrapper[4892]: I1006 12:27:25.933626 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.192468 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97" path="/var/lib/kubelet/pods/7b756ba9-5c1a-44a1-8d2f-57fa3a9f0e97/volumes" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.202057 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a61c1481-8c3c-490a-97eb-a03156bb7ee5" path="/var/lib/kubelet/pods/a61c1481-8c3c-490a-97eb-a03156bb7ee5/volumes" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.203197 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b229a3b8-5243-4ec5-8970-d69b61553a4b" path="/var/lib/kubelet/pods/b229a3b8-5243-4ec5-8970-d69b61553a4b/volumes" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.204039 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0622c4-d0fd-4732-9418-58d60d081887" path="/var/lib/kubelet/pods/ff0622c4-d0fd-4732-9418-58d60d081887/volumes" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.205873 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.206159 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-log" containerID="cri-o://646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5" gracePeriod=30 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.207090 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-httpd" containerID="cri-o://82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348" gracePeriod=30 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.307183 4892 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod70af3608-42f9-456b-9035-3030027e04ca"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod70af3608-42f9-456b-9035-3030027e04ca] : Timed out while waiting for systemd to remove kubepods-besteffort-pod70af3608_42f9_456b_9035_3030027e04ca.slice" Oct 06 12:27:26 crc kubenswrapper[4892]: E1006 12:27:26.307444 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod70af3608-42f9-456b-9035-3030027e04ca] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod70af3608-42f9-456b-9035-3030027e04ca] : Timed out while waiting for systemd to remove kubepods-besteffort-pod70af3608_42f9_456b_9035_3030027e04ca.slice" pod="openstack/horizon-7976c9f5c7-4g42j" podUID="70af3608-42f9-456b-9035-3030027e04ca" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.432723 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="be99d63c-4bdd-4dae-a003-5215816244ac" containerID="cf408c0fdd683bb3f656bf0cb6313060eac66780bb972b183b8321048737d8a4" exitCode=0 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.432798 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-g72j5" event={"ID":"be99d63c-4bdd-4dae-a003-5215816244ac","Type":"ContainerDied","Data":"cf408c0fdd683bb3f656bf0cb6313060eac66780bb972b183b8321048737d8a4"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.434084 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.445832 4892 generic.go:334] "Generic (PLEG): container finished" podID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerID="8c04c36214f42adb11633d352a79a289d237e7b151efdb3764be56ff6b870e37" exitCode=0 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.445929 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" event={"ID":"cac926c3-363d-4d97-ad7a-aab9960020ea","Type":"ContainerDied","Data":"8c04c36214f42adb11633d352a79a289d237e7b151efdb3764be56ff6b870e37"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.445969 4892 scope.go:117] "RemoveContainer" containerID="8c04c36214f42adb11633d352a79a289d237e7b151efdb3764be56ff6b870e37" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.447871 4892 generic.go:334] "Generic (PLEG): container finished" podID="b63c64d6-eb6d-4370-abd8-b4eb8487ae8a" containerID="dc2ee1d288172ce961b1b45c6874ef8af9788f61038164e0770605e25c3aac82" exitCode=0 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.447923 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hbr5f" event={"ID":"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a","Type":"ContainerDied","Data":"dc2ee1d288172ce961b1b45c6874ef8af9788f61038164e0770605e25c3aac82"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.458146 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-config\") pod \"cac926c3-363d-4d97-ad7a-aab9960020ea\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.458385 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-swift-storage-0\") pod \"cac926c3-363d-4d97-ad7a-aab9960020ea\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.458474 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql5hr\" (UniqueName: \"kubernetes.io/projected/cac926c3-363d-4d97-ad7a-aab9960020ea-kube-api-access-ql5hr\") pod \"cac926c3-363d-4d97-ad7a-aab9960020ea\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.458570 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-nb\") pod \"cac926c3-363d-4d97-ad7a-aab9960020ea\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.458683 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-sb\") pod \"cac926c3-363d-4d97-ad7a-aab9960020ea\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.458776 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-svc\") pod \"cac926c3-363d-4d97-ad7a-aab9960020ea\" (UID: \"cac926c3-363d-4d97-ad7a-aab9960020ea\") " Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.466851 4892 generic.go:334] "Generic (PLEG): container finished" podID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerID="1142e5a9f6af62beb67e814f2f19a17657ea6b831ae2cadca200c1e14e6d7943" exitCode=143 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.466961 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"455b9aa7-eeb4-40aa-ae93-872b577730d4","Type":"ContainerDied","Data":"1142e5a9f6af62beb67e814f2f19a17657ea6b831ae2cadca200c1e14e6d7943"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.484368 4892 scope.go:117] "RemoveContainer" containerID="dcf966df5f31628b89240fac42576421acece51a4aed8d4aa9a6cd0a14fcdcba" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.484422 4892 generic.go:334] "Generic (PLEG): container finished" podID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerID="646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5" exitCode=143 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.484514 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875","Type":"ContainerDied","Data":"646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.493210 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac926c3-363d-4d97-ad7a-aab9960020ea-kube-api-access-ql5hr" (OuterVolumeSpecName: "kube-api-access-ql5hr") pod "cac926c3-363d-4d97-ad7a-aab9960020ea" (UID: "cac926c3-363d-4d97-ad7a-aab9960020ea"). InnerVolumeSpecName "kube-api-access-ql5hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.501262 4892 generic.go:334] "Generic (PLEG): container finished" podID="367eb665-c929-4ea9-8fb5-cd23cd430278" containerID="f59939b48aa37657cafd0b0c0f232a4eae9dbca6620699a77ce5538aa0471113" exitCode=0 Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.501376 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q7svw" event={"ID":"367eb665-c929-4ea9-8fb5-cd23cd430278","Type":"ContainerDied","Data":"f59939b48aa37657cafd0b0c0f232a4eae9dbca6620699a77ce5538aa0471113"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.510726 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerStarted","Data":"e65a89b094c281393de9f17af404aa8e3f0d25c8c1096b8fe524d160ae7639ef"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.510762 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerStarted","Data":"8b0469c61f02d91f0410cd850ff06ca65f42c7e27be8227cdcdf7aad09fc8958"} Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.511214 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7976c9f5c7-4g42j" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.554527 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cac926c3-363d-4d97-ad7a-aab9960020ea" (UID: "cac926c3-363d-4d97-ad7a-aab9960020ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.565229 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql5hr\" (UniqueName: \"kubernetes.io/projected/cac926c3-363d-4d97-ad7a-aab9960020ea-kube-api-access-ql5hr\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.565258 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.568144 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7976c9f5c7-4g42j"] Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.581771 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cac926c3-363d-4d97-ad7a-aab9960020ea" (UID: "cac926c3-363d-4d97-ad7a-aab9960020ea"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.582201 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cac926c3-363d-4d97-ad7a-aab9960020ea" (UID: "cac926c3-363d-4d97-ad7a-aab9960020ea"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.593719 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7976c9f5c7-4g42j"] Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.599980 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-config" (OuterVolumeSpecName: "config") pod "cac926c3-363d-4d97-ad7a-aab9960020ea" (UID: "cac926c3-363d-4d97-ad7a-aab9960020ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.600232 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cac926c3-363d-4d97-ad7a-aab9960020ea" (UID: "cac926c3-363d-4d97-ad7a-aab9960020ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.666747 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.666771 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.666780 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.666789 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cac926c3-363d-4d97-ad7a-aab9960020ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.940052 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 06 12:27:26 crc kubenswrapper[4892]: I1006 12:27:26.972266 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-696599f45c-ndwj5" Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.047812 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68f547b878-th9sd"] Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.048276 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68f547b878-th9sd" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-api" containerID="cri-o://bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6" gracePeriod=30 Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.048692 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68f547b878-th9sd" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-httpd" containerID="cri-o://8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8" gracePeriod=30 Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.075662 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.518710 
4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"972058aa-0e97-4041-96c6-41acbb31d3ce","Type":"ContainerStarted","Data":"fa91e02bf13ec02b6c2236fe946a64c0f8a2a214306ff077a5bd1222231263d8"} Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.520630 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"230bad23-f208-488d-90ae-dcdf6c56fa64","Type":"ContainerStarted","Data":"f91ab657a9bff47021a7038e273098d1b5c098dd36d2cd013f840821a9b8b8e8"} Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.520674 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"230bad23-f208-488d-90ae-dcdf6c56fa64","Type":"ContainerStarted","Data":"11d101ef942678781be1f0067563b2a8e64113b31fbd80bbb811a899fb204898"} Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.522213 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.522223 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78555bc94f-rgqkh" event={"ID":"cac926c3-363d-4d97-ad7a-aab9960020ea","Type":"ContainerDied","Data":"cadc9025aab3587f956745adc3c95e27f7519b9bb499e1ca834941f520c781f1"} Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.524244 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerStarted","Data":"1f345ce56d6cf1b21a442c68ba997d821ef6f93d18ac598c8bf2399c9479bb8d"} Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.528205 4892 generic.go:334] "Generic (PLEG): container finished" podID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerID="8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8" exitCode=0 Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.528377 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f547b878-th9sd" event={"ID":"c09ee4a5-59be-4a2c-8701-a572cc9a71ec","Type":"ContainerDied","Data":"8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8"} Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.566275 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78555bc94f-rgqkh"] Oct 06 12:27:27 crc kubenswrapper[4892]: I1006 12:27:27.571488 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78555bc94f-rgqkh"] Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.058530 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.098194 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-g72j5" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.187815 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70af3608-42f9-456b-9035-3030027e04ca" path="/var/lib/kubelet/pods/70af3608-42f9-456b-9035-3030027e04ca/volumes" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.188412 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac926c3-363d-4d97-ad7a-aab9960020ea" path="/var/lib/kubelet/pods/cac926c3-363d-4d97-ad7a-aab9960020ea/volumes" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.193523 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.220239 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vvk9\" (UniqueName: \"kubernetes.io/projected/be99d63c-4bdd-4dae-a003-5215816244ac-kube-api-access-9vvk9\") pod \"be99d63c-4bdd-4dae-a003-5215816244ac\" (UID: \"be99d63c-4bdd-4dae-a003-5215816244ac\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.226726 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be99d63c-4bdd-4dae-a003-5215816244ac-kube-api-access-9vvk9" (OuterVolumeSpecName: "kube-api-access-9vvk9") pod "be99d63c-4bdd-4dae-a003-5215816244ac" (UID: "be99d63c-4bdd-4dae-a003-5215816244ac"). InnerVolumeSpecName "kube-api-access-9vvk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.251697 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hbr5f" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.265984 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-q7svw" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.322932 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9ld7\" (UniqueName: \"kubernetes.io/projected/b63c64d6-eb6d-4370-abd8-b4eb8487ae8a-kube-api-access-q9ld7\") pod \"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a\" (UID: \"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.323147 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rkz2\" (UniqueName: \"kubernetes.io/projected/367eb665-c929-4ea9-8fb5-cd23cd430278-kube-api-access-9rkz2\") pod \"367eb665-c929-4ea9-8fb5-cd23cd430278\" (UID: \"367eb665-c929-4ea9-8fb5-cd23cd430278\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.323544 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vvk9\" (UniqueName: \"kubernetes.io/projected/be99d63c-4bdd-4dae-a003-5215816244ac-kube-api-access-9vvk9\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.326513 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367eb665-c929-4ea9-8fb5-cd23cd430278-kube-api-access-9rkz2" (OuterVolumeSpecName: "kube-api-access-9rkz2") pod "367eb665-c929-4ea9-8fb5-cd23cd430278" (UID: "367eb665-c929-4ea9-8fb5-cd23cd430278"). InnerVolumeSpecName "kube-api-access-9rkz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.327104 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63c64d6-eb6d-4370-abd8-b4eb8487ae8a-kube-api-access-q9ld7" (OuterVolumeSpecName: "kube-api-access-q9ld7") pod "b63c64d6-eb6d-4370-abd8-b4eb8487ae8a" (UID: "b63c64d6-eb6d-4370-abd8-b4eb8487ae8a"). InnerVolumeSpecName "kube-api-access-q9ld7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.427566 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9ld7\" (UniqueName: \"kubernetes.io/projected/b63c64d6-eb6d-4370-abd8-b4eb8487ae8a-kube-api-access-q9ld7\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.427603 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rkz2\" (UniqueName: \"kubernetes.io/projected/367eb665-c929-4ea9-8fb5-cd23cd430278-kube-api-access-9rkz2\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.548634 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.548974 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"972058aa-0e97-4041-96c6-41acbb31d3ce","Type":"ContainerStarted","Data":"a66b2d781edb45fc3bdbf4f7b3025735f555b5bbbe82a036ec6b65062d404f84"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.554235 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-g72j5" event={"ID":"be99d63c-4bdd-4dae-a003-5215816244ac","Type":"ContainerDied","Data":"0ae0826483a1dc09443e4dbef83d7a359d7c5dc6b4373062d41e21ff319eb758"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.554267 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae0826483a1dc09443e4dbef83d7a359d7c5dc6b4373062d41e21ff319eb758" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.554336 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-g72j5" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.556297 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-q7svw" event={"ID":"367eb665-c929-4ea9-8fb5-cd23cd430278","Type":"ContainerDied","Data":"cc2b42523c04f1c0d9fdacd30f0208a7c6b8678a85125f46fd25a11eef09a6e3"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.556318 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc2b42523c04f1c0d9fdacd30f0208a7c6b8678a85125f46fd25a11eef09a6e3" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.556374 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-q7svw" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.572762 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"230bad23-f208-488d-90ae-dcdf6c56fa64","Type":"ContainerStarted","Data":"ed122f200a96b590701a64dbbc1836bc0e555684369da02785f54c06fc92fcad"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.574458 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.600533 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerStarted","Data":"9a76eb98d87027ad0183d5ee6923cb22f1c5e77badf7d4668d781e7b2cc4e26f"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.607848 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.607829749 podStartE2EDuration="3.607829749s" podCreationTimestamp="2025-10-06 12:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:28.597568301 +0000 UTC m=+1135.147274066" watchObservedRunningTime="2025-10-06 12:27:28.607829749 +0000 UTC m=+1135.157535504" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632029 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632121 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-logs\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632146 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-combined-ca-bundle\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632182 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-public-tls-certs\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632311 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-httpd-run\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632379 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632409 4892 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6lkd8\" (UniqueName: \"kubernetes.io/projected/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-kube-api-access-6lkd8\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632448 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-scripts\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.632867 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-logs" (OuterVolumeSpecName: "logs") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.683539 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-scripts" (OuterVolumeSpecName: "scripts") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.689040 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hbr5f" event={"ID":"b63c64d6-eb6d-4370-abd8-b4eb8487ae8a","Type":"ContainerDied","Data":"05030b9e3f08027121a7b4e90990a25f1ee54ab491939c96d1d2bc06f8de80b2"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.689090 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05030b9e3f08027121a7b4e90990a25f1ee54ab491939c96d1d2bc06f8de80b2" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.689170 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hbr5f" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.692886 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.699687 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.699713 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-kube-api-access-6lkd8" (OuterVolumeSpecName: "kube-api-access-6lkd8") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "kube-api-access-6lkd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.722253 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.724763 4892 generic.go:334] "Generic (PLEG): container finished" podID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerID="82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348" exitCode=0 Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.724840 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.724894 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875","Type":"ContainerDied","Data":"82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.724921 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875","Type":"ContainerDied","Data":"a88b903cabb40e74dfef5426fd2b09ac76cfc9de1f1f740238dd8eae8ce1b57c"} Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.724936 4892 scope.go:117] "RemoveContainer" containerID="82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.725085 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.734240 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.734265 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.734276 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.734350 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.734363 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lkd8\" (UniqueName: \"kubernetes.io/projected/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-kube-api-access-6lkd8\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.734371 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 
12:27:28.766064 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.776165 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.819424 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.835383 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data" (OuterVolumeSpecName: "config-data") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.835664 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data\") pod \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\" (UID: \"c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875\") " Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.836099 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: W1006 12:27:28.836457 4892 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875/volumes/kubernetes.io~secret/config-data Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.836471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data" (OuterVolumeSpecName: "config-data") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.856826 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" (UID: "c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.939381 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.939430 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.963839 4892 scope.go:117] "RemoveContainer" containerID="646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.989852 4892 scope.go:117] "RemoveContainer" containerID="82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348" Oct 06 12:27:28 crc kubenswrapper[4892]: E1006 12:27:28.990223 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348\": container with ID starting with 82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348 not found: ID does not exist" containerID="82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.990259 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348"} err="failed to get container status \"82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348\": rpc error: code = NotFound desc = could not find container \"82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348\": container with ID starting with 82a8048f52bdf70667f97805e72f278a77e5beccde38b3fece00547d58a6a348 not found: ID does not exist" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.990280 4892 scope.go:117] "RemoveContainer" containerID="646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5" Oct 06 12:27:28 crc kubenswrapper[4892]: E1006 12:27:28.990580 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5\": container with ID starting with 646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5 not found: ID does not exist" containerID="646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5" Oct 06 12:27:28 crc kubenswrapper[4892]: I1006 12:27:28.990601 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5"} err="failed to get container status \"646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5\": rpc error: code = NotFound desc = could not find container \"646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5\": container with ID starting with 646afd4dbece14c24905780ea40316c7a8528cacaa6ef3d64b08a9aa907b65a5 not found: ID does not exist" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.064937 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.071694 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084197 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:27:29 crc kubenswrapper[4892]: E1006 12:27:29.084596 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-httpd" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084612 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-httpd" Oct 06 12:27:29 crc kubenswrapper[4892]: E1006 12:27:29.084629 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-log" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084636 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-log" Oct 06 12:27:29 crc kubenswrapper[4892]: E1006 12:27:29.084654 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be99d63c-4bdd-4dae-a003-5215816244ac" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084662 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="be99d63c-4bdd-4dae-a003-5215816244ac" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: E1006 12:27:29.084672 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367eb665-c929-4ea9-8fb5-cd23cd430278" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084677 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="367eb665-c929-4ea9-8fb5-cd23cd430278" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: E1006 12:27:29.084685 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerName="dnsmasq-dns" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084691 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerName="dnsmasq-dns" Oct 06 12:27:29 crc kubenswrapper[4892]: E1006 12:27:29.084705 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerName="init" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084710 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerName="init" Oct 06 12:27:29 crc kubenswrapper[4892]: E1006 12:27:29.084719 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63c64d6-eb6d-4370-abd8-b4eb8487ae8a" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084726 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63c64d6-eb6d-4370-abd8-b4eb8487ae8a" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084892 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-httpd" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084904 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="be99d63c-4bdd-4dae-a003-5215816244ac" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084913 4892 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b63c64d6-eb6d-4370-abd8-b4eb8487ae8a" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084923 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac926c3-363d-4d97-ad7a-aab9960020ea" containerName="dnsmasq-dns" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084933 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="367eb665-c929-4ea9-8fb5-cd23cd430278" containerName="mariadb-database-create" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.084939 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" containerName="glance-log" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.086745 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.088668 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.088874 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.100231 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142104 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k844g\" (UniqueName: \"kubernetes.io/projected/7995e823-ea1a-4dce-bd8b-693d5e835a10-kube-api-access-k844g\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142145 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142178 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-config-data\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142346 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7995e823-ea1a-4dce-bd8b-693d5e835a10-logs\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142411 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142463 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142525 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7995e823-ea1a-4dce-bd8b-693d5e835a10-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.142553 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-scripts\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.244796 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k844g\" (UniqueName: \"kubernetes.io/projected/7995e823-ea1a-4dce-bd8b-693d5e835a10-kube-api-access-k844g\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.244850 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.244915 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-config-data\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.244977 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7995e823-ea1a-4dce-bd8b-693d5e835a10-logs\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.245009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.245060 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.245112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7995e823-ea1a-4dce-bd8b-693d5e835a10-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.245132 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-scripts\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.245777 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7995e823-ea1a-4dce-bd8b-693d5e835a10-logs\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.245819 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.246004 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7995e823-ea1a-4dce-bd8b-693d5e835a10-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.249911 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.250000 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.251562 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-scripts\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.253222 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7995e823-ea1a-4dce-bd8b-693d5e835a10-config-data\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.265006 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k844g\" (UniqueName: \"kubernetes.io/projected/7995e823-ea1a-4dce-bd8b-693d5e835a10-kube-api-access-k844g\") pod \"glance-default-external-api-0\" (UID: 
\"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.288737 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"7995e823-ea1a-4dce-bd8b-693d5e835a10\") " pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.400888 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.735872 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"972058aa-0e97-4041-96c6-41acbb31d3ce","Type":"ContainerStarted","Data":"c954878685d792bdb3829f5eda68c26fe08faeacacd7e317db19a235dc1ab63c"} Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.739243 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerStarted","Data":"0b6a38dee4ffab656714bcea9673d2773cc49a33c0f21efb644743991165dad6"} Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.758825 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.758803757 podStartE2EDuration="4.758803757s" podCreationTimestamp="2025-10-06 12:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:29.757086757 +0000 UTC m=+1136.306792542" watchObservedRunningTime="2025-10-06 12:27:29.758803757 +0000 UTC m=+1136.308509522" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.779685 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.573303187 podStartE2EDuration="5.779667834s" podCreationTimestamp="2025-10-06 12:27:24 +0000 UTC" firstStartedPulling="2025-10-06 12:27:25.638870323 +0000 UTC m=+1132.188576088" lastFinishedPulling="2025-10-06 12:27:28.84523497 +0000 UTC m=+1135.394940735" observedRunningTime="2025-10-06 12:27:29.776585324 +0000 UTC m=+1136.326291089" watchObservedRunningTime="2025-10-06 12:27:29.779667834 +0000 UTC m=+1136.329373599" Oct 06 12:27:29 crc kubenswrapper[4892]: I1006 12:27:29.968446 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 06 12:27:29 crc kubenswrapper[4892]: W1006 12:27:29.970572 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7995e823_ea1a_4dce_bd8b_693d5e835a10.slice/crio-1384791280a84303186edd553c5ffdd7716e4ae04ca7e44d4c1474fe9fc5f857 WatchSource:0}: Error finding container 1384791280a84303186edd553c5ffdd7716e4ae04ca7e44d4c1474fe9fc5f857: Status 404 returned error can't find the container with id 1384791280a84303186edd553c5ffdd7716e4ae04ca7e44d4c1474fe9fc5f857 Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.181598 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875" path="/var/lib/kubelet/pods/c7b9680e-f7d8-4e5e-bfe4-bcbb51d3c875/volumes" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.320099 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c7bd-account-create-xgwv5"] 
Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.321224 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7bd-account-create-xgwv5" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.323365 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.330611 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7bd-account-create-xgwv5"] Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.468829 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvwm\" (UniqueName: \"kubernetes.io/projected/4541a67b-e69c-4650-9de1-db5abe24d73b-kube-api-access-2tvwm\") pod \"nova-api-c7bd-account-create-xgwv5\" (UID: \"4541a67b-e69c-4650-9de1-db5abe24d73b\") " pod="openstack/nova-api-c7bd-account-create-xgwv5" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.571366 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvwm\" (UniqueName: \"kubernetes.io/projected/4541a67b-e69c-4650-9de1-db5abe24d73b-kube-api-access-2tvwm\") pod \"nova-api-c7bd-account-create-xgwv5\" (UID: \"4541a67b-e69c-4650-9de1-db5abe24d73b\") " pod="openstack/nova-api-c7bd-account-create-xgwv5" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.585613 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.590466 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvwm\" (UniqueName: \"kubernetes.io/projected/4541a67b-e69c-4650-9de1-db5abe24d73b-kube-api-access-2tvwm\") pod \"nova-api-c7bd-account-create-xgwv5\" (UID: \"4541a67b-e69c-4650-9de1-db5abe24d73b\") " pod="openstack/nova-api-c7bd-account-create-xgwv5" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.630155 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.638273 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c7bd-account-create-xgwv5" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.793029 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.793579 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7995e823-ea1a-4dce-bd8b-693d5e835a10","Type":"ContainerStarted","Data":"3fab8246807f19dcf31a4e1552e826e22f983ff566676bf2fcf67125dcb8e63f"} Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.793605 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7995e823-ea1a-4dce-bd8b-693d5e835a10","Type":"ContainerStarted","Data":"1384791280a84303186edd553c5ffdd7716e4ae04ca7e44d4c1474fe9fc5f857"} Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.794028 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="cinder-scheduler" containerID="cri-o://9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8" gracePeriod=30 Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.794113 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" containerID="cri-o://4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4" gracePeriod=30 Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.794370 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="probe" containerID="cri-o://05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849" gracePeriod=30 Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.795530 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.934833 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 06 12:27:30 crc kubenswrapper[4892]: I1006 12:27:30.986782 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.124275 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c7bd-account-create-xgwv5"] Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.724352 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.818393 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7995e823-ea1a-4dce-bd8b-693d5e835a10","Type":"ContainerStarted","Data":"7980284f4de4883f4417bdf8e18091ab14c367715f2a6f0d5595d186f6a0fae2"} Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.826479 4892 generic.go:334] "Generic (PLEG): container finished" podID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerID="05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849" exitCode=0 Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.826553 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6950e2ad-e715-42cf-ae66-ac50bc683bf1","Type":"ContainerDied","Data":"05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849"} Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.829472 4892 generic.go:334] "Generic (PLEG): container finished" podID="4541a67b-e69c-4650-9de1-db5abe24d73b" containerID="9e18ba5ab5c80f829005cdc295de4854d0af6eb32212f092dc3c3457b110d4a3" exitCode=0 Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.829522 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7bd-account-create-xgwv5" event={"ID":"4541a67b-e69c-4650-9de1-db5abe24d73b","Type":"ContainerDied","Data":"9e18ba5ab5c80f829005cdc295de4854d0af6eb32212f092dc3c3457b110d4a3"} Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.829547 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7bd-account-create-xgwv5" event={"ID":"4541a67b-e69c-4650-9de1-db5abe24d73b","Type":"ContainerStarted","Data":"590ca4b5412da03f2a1d45548589211afeed9b47b9cdd58f91b4ca0bdcc7e20c"} Oct 06 12:27:31 crc kubenswrapper[4892]: I1006 12:27:31.839520 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.839505202 podStartE2EDuration="2.839505202s" podCreationTimestamp="2025-10-06 12:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:31.83326622 +0000 UTC m=+1138.382971995" watchObservedRunningTime="2025-10-06 12:27:31.839505202 +0000 UTC m=+1138.389210967" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.562922 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.724009 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-combined-ca-bundle\") pod \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.724190 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-config\") pod \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.724234 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfpxw\" (UniqueName: \"kubernetes.io/projected/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-kube-api-access-pfpxw\") pod \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.724251 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-httpd-config\") pod \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\" (UID: \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.724272 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-ovndb-tls-certs\") pod \"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\" (UID: 
\"c09ee4a5-59be-4a2c-8701-a572cc9a71ec\") " Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.733742 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c09ee4a5-59be-4a2c-8701-a572cc9a71ec" (UID: "c09ee4a5-59be-4a2c-8701-a572cc9a71ec"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.739471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-kube-api-access-pfpxw" (OuterVolumeSpecName: "kube-api-access-pfpxw") pod "c09ee4a5-59be-4a2c-8701-a572cc9a71ec" (UID: "c09ee4a5-59be-4a2c-8701-a572cc9a71ec"). InnerVolumeSpecName "kube-api-access-pfpxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.793562 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-config" (OuterVolumeSpecName: "config") pod "c09ee4a5-59be-4a2c-8701-a572cc9a71ec" (UID: "c09ee4a5-59be-4a2c-8701-a572cc9a71ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.821934 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c09ee4a5-59be-4a2c-8701-a572cc9a71ec" (UID: "c09ee4a5-59be-4a2c-8701-a572cc9a71ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.828733 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.828755 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.828765 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfpxw\" (UniqueName: \"kubernetes.io/projected/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-kube-api-access-pfpxw\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.828775 4892 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.833755 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c09ee4a5-59be-4a2c-8701-a572cc9a71ec" (UID: "c09ee4a5-59be-4a2c-8701-a572cc9a71ec"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.840088 4892 generic.go:334] "Generic (PLEG): container finished" podID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerID="bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6" exitCode=0 Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.841774 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68f547b878-th9sd" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.845310 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f547b878-th9sd" event={"ID":"c09ee4a5-59be-4a2c-8701-a572cc9a71ec","Type":"ContainerDied","Data":"bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6"} Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.845375 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68f547b878-th9sd" event={"ID":"c09ee4a5-59be-4a2c-8701-a572cc9a71ec","Type":"ContainerDied","Data":"506c8d5bf728c9326e92e31e86810cc723543100204fc5b35f67e756b96bff03"} Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.845398 4892 scope.go:117] "RemoveContainer" containerID="8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.847560 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-central-agent" containerID="cri-o://e65a89b094c281393de9f17af404aa8e3f0d25c8c1096b8fe524d160ae7639ef" gracePeriod=30 Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.847660 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="sg-core" containerID="cri-o://9a76eb98d87027ad0183d5ee6923cb22f1c5e77badf7d4668d781e7b2cc4e26f" gracePeriod=30 Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.847715 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="proxy-httpd" containerID="cri-o://0b6a38dee4ffab656714bcea9673d2773cc49a33c0f21efb644743991165dad6" gracePeriod=30 Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.847767 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-notification-agent" containerID="cri-o://1f345ce56d6cf1b21a442c68ba997d821ef6f93d18ac598c8bf2399c9479bb8d" gracePeriod=30 Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.884971 4892 scope.go:117] "RemoveContainer" containerID="bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.906798 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68f547b878-th9sd"] Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.919823 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-68f547b878-th9sd"] Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.929490 4892 scope.go:117] "RemoveContainer" containerID="8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.930548 4892 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c09ee4a5-59be-4a2c-8701-a572cc9a71ec-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:32 crc kubenswrapper[4892]: E1006 12:27:32.930565 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8\": container with ID starting with 8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8 not found: ID does not exist" containerID="8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.931390 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8"} err="failed to get container status \"8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8\": rpc error: code = NotFound desc = could not find container \"8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8\": container with ID starting with 8508c1e1e60f248ef6423acffb2596ff99a27768361241bd0ded61fedbe64cb8 not found: ID does not exist" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.931412 4892 scope.go:117] "RemoveContainer" containerID="bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6" Oct 06 12:27:32 crc kubenswrapper[4892]: E1006 12:27:32.931968 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6\": container with ID starting with bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6 not found: ID does not exist" containerID="bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6" Oct 06 12:27:32 crc kubenswrapper[4892]: I1006 12:27:32.932010 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6"} err="failed to get container status \"bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6\": rpc error: code = NotFound desc = could not find container \"bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6\": container with ID starting with bb0b76b56a4ca7608399af1a5ac9e5bd3f775f82d8e02929943ef889d21bf9c6 not found: ID does not exist" Oct 06 12:27:32 crc kubenswrapper[4892]: E1006 12:27:32.957992 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc09ee4a5_59be_4a2c_8701_a572cc9a71ec.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc09ee4a5_59be_4a2c_8701_a572cc9a71ec.slice/crio-506c8d5bf728c9326e92e31e86810cc723543100204fc5b35f67e756b96bff03\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dfba438_0442_4bbc_a2ac_723505ee5984.slice/crio-conmon-9a76eb98d87027ad0183d5ee6923cb22f1c5e77badf7d4668d781e7b2cc4e26f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dfba438_0442_4bbc_a2ac_723505ee5984.slice/crio-0b6a38dee4ffab656714bcea9673d2773cc49a33c0f21efb644743991165dad6.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.116560 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.298983 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7bd-account-create-xgwv5" Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.450994 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvwm\" (UniqueName: \"kubernetes.io/projected/4541a67b-e69c-4650-9de1-db5abe24d73b-kube-api-access-2tvwm\") pod \"4541a67b-e69c-4650-9de1-db5abe24d73b\" (UID: \"4541a67b-e69c-4650-9de1-db5abe24d73b\") " Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.470258 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4541a67b-e69c-4650-9de1-db5abe24d73b-kube-api-access-2tvwm" (OuterVolumeSpecName: "kube-api-access-2tvwm") pod "4541a67b-e69c-4650-9de1-db5abe24d73b" (UID: "4541a67b-e69c-4650-9de1-db5abe24d73b"). InnerVolumeSpecName "kube-api-access-2tvwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.553286 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tvwm\" (UniqueName: \"kubernetes.io/projected/4541a67b-e69c-4650-9de1-db5abe24d73b-kube-api-access-2tvwm\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864270 4892 generic.go:334] "Generic (PLEG): container finished" podID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerID="0b6a38dee4ffab656714bcea9673d2773cc49a33c0f21efb644743991165dad6" exitCode=0 Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864310 4892 generic.go:334] "Generic (PLEG): container finished" podID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerID="9a76eb98d87027ad0183d5ee6923cb22f1c5e77badf7d4668d781e7b2cc4e26f" exitCode=2 Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864332 4892 generic.go:334] "Generic (PLEG): container finished" podID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerID="1f345ce56d6cf1b21a442c68ba997d821ef6f93d18ac598c8bf2399c9479bb8d" exitCode=0 Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864340 4892 generic.go:334] "Generic (PLEG): container finished" podID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerID="e65a89b094c281393de9f17af404aa8e3f0d25c8c1096b8fe524d160ae7639ef" exitCode=0 Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864373 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerDied","Data":"0b6a38dee4ffab656714bcea9673d2773cc49a33c0f21efb644743991165dad6"} Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864444 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerDied","Data":"9a76eb98d87027ad0183d5ee6923cb22f1c5e77badf7d4668d781e7b2cc4e26f"} Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864461 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerDied","Data":"1f345ce56d6cf1b21a442c68ba997d821ef6f93d18ac598c8bf2399c9479bb8d"} Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.864474 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerDied","Data":"e65a89b094c281393de9f17af404aa8e3f0d25c8c1096b8fe524d160ae7639ef"} Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.867779 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c7bd-account-create-xgwv5" event={"ID":"4541a67b-e69c-4650-9de1-db5abe24d73b","Type":"ContainerDied","Data":"590ca4b5412da03f2a1d45548589211afeed9b47b9cdd58f91b4ca0bdcc7e20c"} Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.867827 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="590ca4b5412da03f2a1d45548589211afeed9b47b9cdd58f91b4ca0bdcc7e20c" Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.867839 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c7bd-account-create-xgwv5" Oct 06 12:27:33 crc kubenswrapper[4892]: I1006 12:27:33.989036 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.168503 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-log-httpd\") pod \"4dfba438-0442-4bbc-a2ac-723505ee5984\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.168872 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-combined-ca-bundle\") pod \"4dfba438-0442-4bbc-a2ac-723505ee5984\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.168911 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bsx4\" (UniqueName: \"kubernetes.io/projected/4dfba438-0442-4bbc-a2ac-723505ee5984-kube-api-access-7bsx4\") pod \"4dfba438-0442-4bbc-a2ac-723505ee5984\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.168976 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-config-data\") pod \"4dfba438-0442-4bbc-a2ac-723505ee5984\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.169005 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-scripts\") pod \"4dfba438-0442-4bbc-a2ac-723505ee5984\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.169049 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-run-httpd\") pod \"4dfba438-0442-4bbc-a2ac-723505ee5984\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.169154 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-sg-core-conf-yaml\") pod \"4dfba438-0442-4bbc-a2ac-723505ee5984\" (UID: \"4dfba438-0442-4bbc-a2ac-723505ee5984\") " Oct 06 12:27:34 crc kubenswrapper[4892]: 
I1006 12:27:34.170249 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4dfba438-0442-4bbc-a2ac-723505ee5984" (UID: "4dfba438-0442-4bbc-a2ac-723505ee5984"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.171701 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4dfba438-0442-4bbc-a2ac-723505ee5984" (UID: "4dfba438-0442-4bbc-a2ac-723505ee5984"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.180235 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" path="/var/lib/kubelet/pods/c09ee4a5-59be-4a2c-8701-a572cc9a71ec/volumes" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.182781 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dfba438-0442-4bbc-a2ac-723505ee5984-kube-api-access-7bsx4" (OuterVolumeSpecName: "kube-api-access-7bsx4") pod "4dfba438-0442-4bbc-a2ac-723505ee5984" (UID: "4dfba438-0442-4bbc-a2ac-723505ee5984"). InnerVolumeSpecName "kube-api-access-7bsx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.188712 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-scripts" (OuterVolumeSpecName: "scripts") pod "4dfba438-0442-4bbc-a2ac-723505ee5984" (UID: "4dfba438-0442-4bbc-a2ac-723505ee5984"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.204719 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4dfba438-0442-4bbc-a2ac-723505ee5984" (UID: "4dfba438-0442-4bbc-a2ac-723505ee5984"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.276769 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.277291 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bsx4\" (UniqueName: \"kubernetes.io/projected/4dfba438-0442-4bbc-a2ac-723505ee5984-kube-api-access-7bsx4\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.277404 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.277469 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dfba438-0442-4bbc-a2ac-723505ee5984-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.277537 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.294378 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dfba438-0442-4bbc-a2ac-723505ee5984" (UID: "4dfba438-0442-4bbc-a2ac-723505ee5984"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.320468 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-config-data" (OuterVolumeSpecName: "config-data") pod "4dfba438-0442-4bbc-a2ac-723505ee5984" (UID: "4dfba438-0442-4bbc-a2ac-723505ee5984"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.378854 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.378885 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dfba438-0442-4bbc-a2ac-723505ee5984-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.650114 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.783966 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2xqj\" (UniqueName: \"kubernetes.io/projected/6950e2ad-e715-42cf-ae66-ac50bc683bf1-kube-api-access-p2xqj\") pod \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.784024 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data\") pod \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.784054 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-scripts\") pod \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.784091 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data-custom\") pod \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.784192 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-combined-ca-bundle\") pod \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.784272 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6950e2ad-e715-42cf-ae66-ac50bc683bf1-etc-machine-id\") pod \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\" (UID: \"6950e2ad-e715-42cf-ae66-ac50bc683bf1\") " Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.784635 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6950e2ad-e715-42cf-ae66-ac50bc683bf1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6950e2ad-e715-42cf-ae66-ac50bc683bf1" (UID: "6950e2ad-e715-42cf-ae66-ac50bc683bf1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.789424 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6950e2ad-e715-42cf-ae66-ac50bc683bf1-kube-api-access-p2xqj" (OuterVolumeSpecName: "kube-api-access-p2xqj") pod "6950e2ad-e715-42cf-ae66-ac50bc683bf1" (UID: "6950e2ad-e715-42cf-ae66-ac50bc683bf1"). InnerVolumeSpecName "kube-api-access-p2xqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.789873 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6950e2ad-e715-42cf-ae66-ac50bc683bf1" (UID: "6950e2ad-e715-42cf-ae66-ac50bc683bf1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.790507 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-scripts" (OuterVolumeSpecName: "scripts") pod "6950e2ad-e715-42cf-ae66-ac50bc683bf1" (UID: "6950e2ad-e715-42cf-ae66-ac50bc683bf1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.867310 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6950e2ad-e715-42cf-ae66-ac50bc683bf1" (UID: "6950e2ad-e715-42cf-ae66-ac50bc683bf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.886377 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2xqj\" (UniqueName: \"kubernetes.io/projected/6950e2ad-e715-42cf-ae66-ac50bc683bf1-kube-api-access-p2xqj\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.886408 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.886418 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.886433 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.886441 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6950e2ad-e715-42cf-ae66-ac50bc683bf1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.889986 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dfba438-0442-4bbc-a2ac-723505ee5984","Type":"ContainerDied","Data":"8b0469c61f02d91f0410cd850ff06ca65f42c7e27be8227cdcdf7aad09fc8958"} Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.890008 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.890036 4892 scope.go:117] "RemoveContainer" containerID="0b6a38dee4ffab656714bcea9673d2773cc49a33c0f21efb644743991165dad6" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.894128 4892 generic.go:334] "Generic (PLEG): container finished" podID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerID="9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8" exitCode=0 Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.894161 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6950e2ad-e715-42cf-ae66-ac50bc683bf1","Type":"ContainerDied","Data":"9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8"} Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.894183 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6950e2ad-e715-42cf-ae66-ac50bc683bf1","Type":"ContainerDied","Data":"daad2bb6da6e0494a1a0e813b1a3a15341ee3e95eedaf56a9439ba5a13abdb30"} Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.894243 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.925223 4892 scope.go:117] "RemoveContainer" containerID="9a76eb98d87027ad0183d5ee6923cb22f1c5e77badf7d4668d781e7b2cc4e26f" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.948449 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.948679 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data" (OuterVolumeSpecName: "config-data") pod "6950e2ad-e715-42cf-ae66-ac50bc683bf1" (UID: "6950e2ad-e715-42cf-ae66-ac50bc683bf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.955808 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.955929 4892 scope.go:117] "RemoveContainer" containerID="1f345ce56d6cf1b21a442c68ba997d821ef6f93d18ac598c8bf2399c9479bb8d" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965425 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965804 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="probe" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965821 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="probe" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965850 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-api" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965856 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-api" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965866 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-central-agent" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965874 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-central-agent" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965885 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-httpd" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965891 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-httpd" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965903 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="proxy-httpd" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965911 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="proxy-httpd" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965926 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-notification-agent" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965932 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-notification-agent" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965944 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="sg-core" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965950 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="sg-core" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965960 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4541a67b-e69c-4650-9de1-db5abe24d73b" containerName="mariadb-account-create" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 
12:27:34.965966 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4541a67b-e69c-4650-9de1-db5abe24d73b" containerName="mariadb-account-create" Oct 06 12:27:34 crc kubenswrapper[4892]: E1006 12:27:34.965977 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="cinder-scheduler" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.965983 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="cinder-scheduler" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966174 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-notification-agent" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966189 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-httpd" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966202 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="cinder-scheduler" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966209 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" containerName="probe" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966225 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4541a67b-e69c-4650-9de1-db5abe24d73b" containerName="mariadb-account-create" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966235 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="ceilometer-central-agent" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966244 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="sg-core" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966253 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09ee4a5-59be-4a2c-8701-a572cc9a71ec" containerName="neutron-api" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.966259 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" containerName="proxy-httpd" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.968092 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.975406 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.976337 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.984268 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.988212 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6950e2ad-e715-42cf-ae66-ac50bc683bf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:34 crc kubenswrapper[4892]: I1006 12:27:34.991516 4892 scope.go:117] "RemoveContainer" containerID="e65a89b094c281393de9f17af404aa8e3f0d25c8c1096b8fe524d160ae7639ef" Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.021317 4892 scope.go:117] "RemoveContainer" containerID="05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849" Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.043269 4892 scope.go:117] "RemoveContainer" containerID="9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8" Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.059640 4892 scope.go:117] "RemoveContainer" containerID="05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849" Oct 06 12:27:35 crc kubenswrapper[4892]: E1006 12:27:35.060009 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849\": container with ID starting with 05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849 not found: ID does not exist" containerID="05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849" Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.060106 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849"} err="failed to get container status \"05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849\": rpc error: code = NotFound desc = could not find container \"05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849\": container with ID starting with 05fe9bc98bfd42de777110489ef1e540b87e5493680cc5bf1a580bad0e63b849 not found: ID does not exist" Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.060175 4892 scope.go:117] "RemoveContainer" containerID="9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8" Oct 06 12:27:35 crc kubenswrapper[4892]: E1006 12:27:35.060582 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8\": container with ID starting with 9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8 not found: ID does not exist" containerID="9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8" Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.060626 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8"} err="failed to get container status \"9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8\": rpc 
error: code = NotFound desc = could not find container \"9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8\": container with ID starting with 9e0ac05de5537d3b0882b60618d672e931a38f287b2479b51d7d6549794a80e8 not found: ID does not exist"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.089729 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-config-data\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.089783 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.090316 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4hmf\" (UniqueName: \"kubernetes.io/projected/cfaa25f1-1ec7-470c-9660-7342674c6786-kube-api-access-m4hmf\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.090378 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-log-httpd\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.090445 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-run-httpd\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.090534 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-scripts\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.090615 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.192128 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-scripts\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.192307 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.192953 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-config-data\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.193005 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.193029 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4hmf\" (UniqueName: \"kubernetes.io/projected/cfaa25f1-1ec7-470c-9660-7342674c6786-kube-api-access-m4hmf\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.193097 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-log-httpd\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.193185 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-run-httpd\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.193645 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-run-httpd\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.195682 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-scripts\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.197173 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-config-data\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.197417 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.198475 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.206639 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-log-httpd\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.208888 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4hmf\" (UniqueName: \"kubernetes.io/projected/cfaa25f1-1ec7-470c-9660-7342674c6786-kube-api-access-m4hmf\") pod \"ceilometer-0\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") " pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.296053 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.304565 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.316870 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.357867 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.360625 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.363907 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.381507 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.501869 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2zn7\" (UniqueName: \"kubernetes.io/projected/22e3a220-0262-4414-a93c-0da5d9d8cce3-kube-api-access-q2zn7\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.502348 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22e3a220-0262-4414-a93c-0da5d9d8cce3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.502391 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.502430 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-scripts\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.502454 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-config-data\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.502557 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.605491 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-scripts\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.605551 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-config-data\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.605697 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.605837 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2zn7\" (UniqueName: \"kubernetes.io/projected/22e3a220-0262-4414-a93c-0da5d9d8cce3-kube-api-access-q2zn7\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.605901 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22e3a220-0262-4414-a93c-0da5d9d8cce3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.606034 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/22e3a220-0262-4414-a93c-0da5d9d8cce3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.605967 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.609376 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-scripts\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.610195 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-config-data\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.610759 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.613029 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22e3a220-0262-4414-a93c-0da5d9d8cce3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.627248 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2zn7\" (UniqueName: \"kubernetes.io/projected/22e3a220-0262-4414-a93c-0da5d9d8cce3-kube-api-access-q2zn7\") pod \"cinder-scheduler-0\" (UID: \"22e3a220-0262-4414-a93c-0da5d9d8cce3\") " pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.720749 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.795312 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.920115 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerStarted","Data":"6eb076ac63486bc44aae6f8e891acbde88392f2800134fadcdb321730019357d"}
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.923585 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.923642 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.937097 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.974955 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.979849 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:35 crc kubenswrapper[4892]: I1006 12:27:35.993102 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.179593 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dfba438-0442-4bbc-a2ac-723505ee5984" path="/var/lib/kubelet/pods/4dfba438-0442-4bbc-a2ac-723505ee5984/volumes"
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.180482 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6950e2ad-e715-42cf-ae66-ac50bc683bf1" path="/var/lib/kubelet/pods/6950e2ad-e715-42cf-ae66-ac50bc683bf1/volumes"
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.253174 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 06 12:27:36 crc kubenswrapper[4892]: W1006 12:27:36.257062 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22e3a220_0262_4414_a93c_0da5d9d8cce3.slice/crio-52b48490e06b9fef67388651e97a6c46fff34066d15bfc19dc272753cfe5f9e7 WatchSource:0}: Error finding container 52b48490e06b9fef67388651e97a6c46fff34066d15bfc19dc272753cfe5f9e7: Status 404 returned error can't find the container with id 52b48490e06b9fef67388651e97a6c46fff34066d15bfc19dc272753cfe5f9e7
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.942393 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerStarted","Data":"da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648"}
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.942745 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerStarted","Data":"4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759"}
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.964406 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22e3a220-0262-4414-a93c-0da5d9d8cce3","Type":"ContainerStarted","Data":"144e54473eed41919c3ecddd23ab9ae06f7d85ccfec598d54a5116873ce90ffb"}
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.964500 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22e3a220-0262-4414-a93c-0da5d9d8cce3","Type":"ContainerStarted","Data":"52b48490e06b9fef67388651e97a6c46fff34066d15bfc19dc272753cfe5f9e7"}
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.964982 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.965038 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:36 crc kubenswrapper[4892]: I1006 12:27:36.975888 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Oct 06 12:27:37 crc kubenswrapper[4892]: I1006 12:27:37.975006 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerStarted","Data":"bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07"}
Oct 06 12:27:37 crc kubenswrapper[4892]: I1006 12:27:37.976734 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"22e3a220-0262-4414-a93c-0da5d9d8cce3","Type":"ContainerStarted","Data":"83f54f58896a9b320bfb74740b94bfe9b13ef3c3f8a345d2880e3b2b55da7822"}
Oct 06 12:27:38 crc kubenswrapper[4892]: I1006 12:27:38.002881 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.002865297 podStartE2EDuration="3.002865297s" podCreationTimestamp="2025-10-06 12:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:27:37.996580624 +0000 UTC m=+1144.546286389" watchObservedRunningTime="2025-10-06 12:27:38.002865297 +0000 UTC m=+1144.552571062"
Oct 06 12:27:38 crc kubenswrapper[4892]: E1006 12:27:38.060435 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Oct 06 12:27:38 crc kubenswrapper[4892]: E1006 12:27:38.061998 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Oct 06 12:27:38 crc kubenswrapper[4892]: E1006 12:27:38.063215 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4" cmd=["/usr/bin/pgrep","-f","-r","DRST","watcher-decision-engine"]
Oct 06 12:27:38 crc kubenswrapper[4892]: E1006 12:27:38.063253 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-decision-engine-0" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine"
Oct 06 12:27:38 crc kubenswrapper[4892]: I1006 12:27:38.943561 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:38 crc kubenswrapper[4892]: I1006 12:27:38.948772 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Oct 06 12:27:38 crc kubenswrapper[4892]: I1006 12:27:38.987362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerStarted","Data":"5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be"}
Oct 06 12:27:38 crc kubenswrapper[4892]: I1006 12:27:38.988249 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 06 12:27:39 crc kubenswrapper[4892]: I1006 12:27:39.015181 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.434311381 podStartE2EDuration="5.015163884s" podCreationTimestamp="2025-10-06 12:27:34 +0000 UTC" firstStartedPulling="2025-10-06 12:27:35.809853718 +0000 UTC m=+1142.359559483" lastFinishedPulling="2025-10-06 12:27:38.390706221 +0000 UTC m=+1144.940411986" observedRunningTime="2025-10-06 12:27:39.003460144 +0000 UTC m=+1145.553165909" watchObservedRunningTime="2025-10-06 12:27:39.015163884 +0000 UTC m=+1145.564869659"
Oct 06 12:27:39 crc kubenswrapper[4892]: I1006 12:27:39.401085 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:39 crc kubenswrapper[4892]: I1006 12:27:39.401140 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:39 crc kubenswrapper[4892]: I1006 12:27:39.437144 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:39 crc kubenswrapper[4892]: I1006 12:27:39.451220 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:39 crc kubenswrapper[4892]: I1006 12:27:39.995353 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:39 crc kubenswrapper[4892]: I1006 12:27:39.995403 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.536877 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-2416-account-create-8kmnp"]
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.539314 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2416-account-create-8kmnp"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.542590 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.570198 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2416-account-create-8kmnp"]
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.601895 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm585\" (UniqueName: \"kubernetes.io/projected/a1cfcdc0-6f22-4778-9a8f-c050eba27482-kube-api-access-cm585\") pod \"nova-cell0-2416-account-create-8kmnp\" (UID: \"a1cfcdc0-6f22-4778-9a8f-c050eba27482\") " pod="openstack/nova-cell0-2416-account-create-8kmnp"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.703317 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm585\" (UniqueName: \"kubernetes.io/projected/a1cfcdc0-6f22-4778-9a8f-c050eba27482-kube-api-access-cm585\") pod \"nova-cell0-2416-account-create-8kmnp\" (UID: \"a1cfcdc0-6f22-4778-9a8f-c050eba27482\") " pod="openstack/nova-cell0-2416-account-create-8kmnp"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.721419 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.722011 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm585\" (UniqueName: \"kubernetes.io/projected/a1cfcdc0-6f22-4778-9a8f-c050eba27482-kube-api-access-cm585\") pod \"nova-cell0-2416-account-create-8kmnp\" (UID: \"a1cfcdc0-6f22-4778-9a8f-c050eba27482\") " pod="openstack/nova-cell0-2416-account-create-8kmnp"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.740032 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9231-account-create-pt6vz"]
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.741350 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9231-account-create-pt6vz"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.744343 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.773183 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9231-account-create-pt6vz"]
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.804829 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ltdp\" (UniqueName: \"kubernetes.io/projected/34fa2c05-cfbf-4c6d-ad52-01568460df84-kube-api-access-4ltdp\") pod \"nova-cell1-9231-account-create-pt6vz\" (UID: \"34fa2c05-cfbf-4c6d-ad52-01568460df84\") " pod="openstack/nova-cell1-9231-account-create-pt6vz"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.875405 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2416-account-create-8kmnp"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.909963 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ltdp\" (UniqueName: \"kubernetes.io/projected/34fa2c05-cfbf-4c6d-ad52-01568460df84-kube-api-access-4ltdp\") pod \"nova-cell1-9231-account-create-pt6vz\" (UID: \"34fa2c05-cfbf-4c6d-ad52-01568460df84\") " pod="openstack/nova-cell1-9231-account-create-pt6vz"
Oct 06 12:27:40 crc kubenswrapper[4892]: I1006 12:27:40.934074 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ltdp\" (UniqueName: \"kubernetes.io/projected/34fa2c05-cfbf-4c6d-ad52-01568460df84-kube-api-access-4ltdp\") pod \"nova-cell1-9231-account-create-pt6vz\" (UID: \"34fa2c05-cfbf-4c6d-ad52-01568460df84\") " pod="openstack/nova-cell1-9231-account-create-pt6vz"
Oct 06 12:27:41 crc kubenswrapper[4892]: I1006 12:27:41.089524 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9231-account-create-pt6vz"
Oct 06 12:27:41 crc kubenswrapper[4892]: I1006 12:27:41.351166 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-2416-account-create-8kmnp"]
Oct 06 12:27:41 crc kubenswrapper[4892]: I1006 12:27:41.582893 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9231-account-create-pt6vz"]
Oct 06 12:27:41 crc kubenswrapper[4892]: W1006 12:27:41.584092 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34fa2c05_cfbf_4c6d_ad52_01568460df84.slice/crio-053d85a303594e58bdc2625b21c9387b79fa664278fb9753ba0e35a3eb73c95c WatchSource:0}: Error finding container 053d85a303594e58bdc2625b21c9387b79fa664278fb9753ba0e35a3eb73c95c: Status 404 returned error can't find the container with id 053d85a303594e58bdc2625b21c9387b79fa664278fb9753ba0e35a3eb73c95c
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.019792 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1cfcdc0-6f22-4778-9a8f-c050eba27482" containerID="41e82361141e836db8e5cf4ba79ef7f46f7ed4e27fb5f38df914c6371d2a99c9" exitCode=0
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.019922 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2416-account-create-8kmnp" event={"ID":"a1cfcdc0-6f22-4778-9a8f-c050eba27482","Type":"ContainerDied","Data":"41e82361141e836db8e5cf4ba79ef7f46f7ed4e27fb5f38df914c6371d2a99c9"}
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.020282 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2416-account-create-8kmnp" event={"ID":"a1cfcdc0-6f22-4778-9a8f-c050eba27482","Type":"ContainerStarted","Data":"e2c1018ca262da2bd343336488f68d0fac3e052f8ced5d4766d7e8e20cead5a9"}
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.021629 4892 generic.go:334] "Generic (PLEG): container finished" podID="34fa2c05-cfbf-4c6d-ad52-01568460df84" containerID="5d57a9dcabefb22c4ddb614852f10e80b30da8b8e557bece3963c3c33a2f8296" exitCode=0
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.021674 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9231-account-create-pt6vz" event={"ID":"34fa2c05-cfbf-4c6d-ad52-01568460df84","Type":"ContainerDied","Data":"5d57a9dcabefb22c4ddb614852f10e80b30da8b8e557bece3963c3c33a2f8296"}
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.021698 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9231-account-create-pt6vz" event={"ID":"34fa2c05-cfbf-4c6d-ad52-01568460df84","Type":"ContainerStarted","Data":"053d85a303594e58bdc2625b21c9387b79fa664278fb9753ba0e35a3eb73c95c"}
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.068278 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.068427 4892 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Oct 06 12:27:42 crc kubenswrapper[4892]: I1006 12:27:42.070639 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.074011 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.074242 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-central-agent" containerID="cri-o://4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759" gracePeriod=30
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.075260 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="proxy-httpd" containerID="cri-o://5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be" gracePeriod=30
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.075698 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-notification-agent" containerID="cri-o://da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648" gracePeriod=30
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.075740 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="sg-core" containerID="cri-o://bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07" gracePeriod=30
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.640360 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9231-account-create-pt6vz"
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.648124 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2416-account-create-8kmnp"
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.812219 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm585\" (UniqueName: \"kubernetes.io/projected/a1cfcdc0-6f22-4778-9a8f-c050eba27482-kube-api-access-cm585\") pod \"a1cfcdc0-6f22-4778-9a8f-c050eba27482\" (UID: \"a1cfcdc0-6f22-4778-9a8f-c050eba27482\") "
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.812616 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ltdp\" (UniqueName: \"kubernetes.io/projected/34fa2c05-cfbf-4c6d-ad52-01568460df84-kube-api-access-4ltdp\") pod \"34fa2c05-cfbf-4c6d-ad52-01568460df84\" (UID: \"34fa2c05-cfbf-4c6d-ad52-01568460df84\") "
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.819256 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1cfcdc0-6f22-4778-9a8f-c050eba27482-kube-api-access-cm585" (OuterVolumeSpecName: "kube-api-access-cm585") pod "a1cfcdc0-6f22-4778-9a8f-c050eba27482" (UID: "a1cfcdc0-6f22-4778-9a8f-c050eba27482"). InnerVolumeSpecName "kube-api-access-cm585". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.819482 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34fa2c05-cfbf-4c6d-ad52-01568460df84-kube-api-access-4ltdp" (OuterVolumeSpecName: "kube-api-access-4ltdp") pod "34fa2c05-cfbf-4c6d-ad52-01568460df84" (UID: "34fa2c05-cfbf-4c6d-ad52-01568460df84"). InnerVolumeSpecName "kube-api-access-4ltdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.915067 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ltdp\" (UniqueName: \"kubernetes.io/projected/34fa2c05-cfbf-4c6d-ad52-01568460df84-kube-api-access-4ltdp\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:43 crc kubenswrapper[4892]: I1006 12:27:43.915092 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm585\" (UniqueName: \"kubernetes.io/projected/a1cfcdc0-6f22-4778-9a8f-c050eba27482-kube-api-access-cm585\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.042220 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-2416-account-create-8kmnp"
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.042220 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-2416-account-create-8kmnp" event={"ID":"a1cfcdc0-6f22-4778-9a8f-c050eba27482","Type":"ContainerDied","Data":"e2c1018ca262da2bd343336488f68d0fac3e052f8ced5d4766d7e8e20cead5a9"}
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.042361 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2c1018ca262da2bd343336488f68d0fac3e052f8ced5d4766d7e8e20cead5a9"
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.044457 4892 generic.go:334] "Generic (PLEG): container finished" podID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerID="5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be" exitCode=0
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.044477 4892 generic.go:334] "Generic (PLEG): container finished" podID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerID="bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07" exitCode=2
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.044479 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerDied","Data":"5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be"}
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.044507 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerDied","Data":"bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07"}
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.044538 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerDied","Data":"4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759"}
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.044486 4892 generic.go:334] "Generic (PLEG): container finished" podID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerID="4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759" exitCode=0
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.045992 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9231-account-create-pt6vz" event={"ID":"34fa2c05-cfbf-4c6d-ad52-01568460df84","Type":"ContainerDied","Data":"053d85a303594e58bdc2625b21c9387b79fa664278fb9753ba0e35a3eb73c95c"}
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.046013 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053d85a303594e58bdc2625b21c9387b79fa664278fb9753ba0e35a3eb73c95c"
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.046051 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9231-account-create-pt6vz"
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.621102 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.728066 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-scripts\") pod \"cfaa25f1-1ec7-470c-9660-7342674c6786\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") "
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.728304 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-run-httpd\") pod \"cfaa25f1-1ec7-470c-9660-7342674c6786\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") "
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.728350 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4hmf\" (UniqueName: \"kubernetes.io/projected/cfaa25f1-1ec7-470c-9660-7342674c6786-kube-api-access-m4hmf\") pod \"cfaa25f1-1ec7-470c-9660-7342674c6786\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") "
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.728380 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-sg-core-conf-yaml\") pod \"cfaa25f1-1ec7-470c-9660-7342674c6786\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") "
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.728447 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-config-data\") pod \"cfaa25f1-1ec7-470c-9660-7342674c6786\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") "
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.728479 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-combined-ca-bundle\") pod \"cfaa25f1-1ec7-470c-9660-7342674c6786\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") "
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.728494 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-log-httpd\") pod \"cfaa25f1-1ec7-470c-9660-7342674c6786\" (UID: \"cfaa25f1-1ec7-470c-9660-7342674c6786\") "
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.729389 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cfaa25f1-1ec7-470c-9660-7342674c6786" (UID: "cfaa25f1-1ec7-470c-9660-7342674c6786"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.729709 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cfaa25f1-1ec7-470c-9660-7342674c6786" (UID: "cfaa25f1-1ec7-470c-9660-7342674c6786"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.733795 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-scripts" (OuterVolumeSpecName: "scripts") pod "cfaa25f1-1ec7-470c-9660-7342674c6786" (UID: "cfaa25f1-1ec7-470c-9660-7342674c6786"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.743461 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfaa25f1-1ec7-470c-9660-7342674c6786-kube-api-access-m4hmf" (OuterVolumeSpecName: "kube-api-access-m4hmf") pod "cfaa25f1-1ec7-470c-9660-7342674c6786" (UID: "cfaa25f1-1ec7-470c-9660-7342674c6786"). InnerVolumeSpecName "kube-api-access-m4hmf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.765067 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cfaa25f1-1ec7-470c-9660-7342674c6786" (UID: "cfaa25f1-1ec7-470c-9660-7342674c6786"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.830213 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.830237 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4hmf\" (UniqueName: \"kubernetes.io/projected/cfaa25f1-1ec7-470c-9660-7342674c6786-kube-api-access-m4hmf\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.830247 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.830255 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cfaa25f1-1ec7-470c-9660-7342674c6786-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.830262 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-scripts\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.831410 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfaa25f1-1ec7-470c-9660-7342674c6786" (UID: "cfaa25f1-1ec7-470c-9660-7342674c6786"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.856471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-config-data" (OuterVolumeSpecName: "config-data") pod "cfaa25f1-1ec7-470c-9660-7342674c6786" (UID: "cfaa25f1-1ec7-470c-9660-7342674c6786"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.932441 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:44 crc kubenswrapper[4892]: I1006 12:27:44.932475 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfaa25f1-1ec7-470c-9660-7342674c6786-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.059536 4892 generic.go:334] "Generic (PLEG): container finished" podID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerID="da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648" exitCode=0
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.059577 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerDied","Data":"da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648"}
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.059604 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cfaa25f1-1ec7-470c-9660-7342674c6786","Type":"ContainerDied","Data":"6eb076ac63486bc44aae6f8e891acbde88392f2800134fadcdb321730019357d"}
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.059620 4892 scope.go:117] "RemoveContainer" containerID="5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.059744 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.089764 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.108461 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.111762 4892 scope.go:117] "RemoveContainer" containerID="bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.117050 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.118791 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fa2c05-cfbf-4c6d-ad52-01568460df84" containerName="mariadb-account-create"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.118811 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fa2c05-cfbf-4c6d-ad52-01568460df84" containerName="mariadb-account-create"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.118837 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="proxy-httpd"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.118844 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="proxy-httpd"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.118852 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-central-agent"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.118888 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-central-agent"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.118914 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="sg-core"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.118920 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="sg-core"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.118933 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1cfcdc0-6f22-4778-9a8f-c050eba27482" containerName="mariadb-account-create"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.118939 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cfcdc0-6f22-4778-9a8f-c050eba27482" containerName="mariadb-account-create"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.118950 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-notification-agent"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.118956 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-notification-agent"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.119129 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1cfcdc0-6f22-4778-9a8f-c050eba27482" containerName="mariadb-account-create"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.119142 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-notification-agent"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.119161 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fa2c05-cfbf-4c6d-ad52-01568460df84" containerName="mariadb-account-create"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.119176 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="proxy-httpd"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.119195 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="ceilometer-central-agent"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.119211 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" containerName="sg-core"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.121022 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.128220 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.128994 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.130904 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.140308 4892 scope.go:117] "RemoveContainer" containerID="da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.141234 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.141270 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmjt\" (UniqueName: \"kubernetes.io/projected/320898d4-f2c7-4164-830b-ee3d96e34d28-kube-api-access-vfmjt\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.141308 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-run-httpd\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.141436 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.141511 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-scripts\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.141540 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-config-data\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.141584 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-log-httpd\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.161767 4892 scope.go:117] "RemoveContainer" containerID="4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.183465 4892 scope.go:117] "RemoveContainer" containerID="5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.183837 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be\": container with ID starting with 5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be not found: ID does not exist" containerID="5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.183881 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be"} err="failed to get container status \"5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be\": rpc error: code = NotFound desc = could not find container \"5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be\": container with ID starting with 5e669460eefc9179942c7c196d51c23c486c8cd75fa1526be17adb5f7059c3be not found: ID does not exist"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.183909 4892 scope.go:117] "RemoveContainer" containerID="bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.184373 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07\": container with ID starting with bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07 not found: ID does not exist" containerID="bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.184400 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07"} err="failed to get container status \"bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07\": rpc error: code = NotFound desc = could not find container \"bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07\": container with ID starting with bf054ad30d1c262746fda911b92765f74b89f6f350d6b4e907f9c19ca824ba07 not found: ID does not exist"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.184413 4892 scope.go:117] "RemoveContainer" containerID="da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.185111 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648\": container with ID starting with da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648 not found: ID does not exist" containerID="da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.185145 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648"} err="failed to get container status \"da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648\": rpc error: code = NotFound desc = could not find container \"da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648\": container with ID starting with da1f8c52477810027f60f1f597cf7ac496f5789482aaa14905883b992d02c648 not found: ID does not exist"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.185169 4892 scope.go:117] "RemoveContainer" containerID="4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759"
Oct 06 12:27:45 crc kubenswrapper[4892]: E1006 12:27:45.185683 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759\": container with ID starting with 4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759 not found: ID does not exist" containerID="4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.185707 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759"} err="failed to get container status \"4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759\": rpc error: code = NotFound desc = could not find container \"4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759\": container with ID starting with 4b8ac45a9af706e332d312d3d289453800275d89ae2cf784f5d29b5a5ea3c759 not found: ID does not exist"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.243547 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-scripts\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.243599 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-config-data\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.243622 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-log-httpd\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.243664 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.243685 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmjt\" (UniqueName: \"kubernetes.io/projected/320898d4-f2c7-4164-830b-ee3d96e34d28-kube-api-access-vfmjt\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.243720 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-run-httpd\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.243807 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.244092 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-log-httpd\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.244423 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-run-httpd\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.247606 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.247643 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-scripts\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.248683 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.254914 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-config-data\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.262757 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmjt\" (UniqueName: \"kubernetes.io/projected/320898d4-f2c7-4164-830b-ee3d96e34d28-kube-api-access-vfmjt\") pod \"ceilometer-0\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.450004 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.715443 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmjq4"]
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.717221 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.720981 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.721210 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hngwc"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.726588 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.732609 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmjq4"]
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.758773 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-config-data\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.758837 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjlf\" (UniqueName: \"kubernetes.io/projected/812d3ada-a315-4c39-9f56-bb54525a0df2-kube-api-access-4hjlf\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.758892 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.759075 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-scripts\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.863067 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-config-data\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.863449 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjlf\" (UniqueName: \"kubernetes.io/projected/812d3ada-a315-4c39-9f56-bb54525a0df2-kube-api-access-4hjlf\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.863525 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.863616 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-scripts\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.869116 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-scripts\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.870108 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.871920 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-config-data\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.885908 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjlf\" (UniqueName: \"kubernetes.io/projected/812d3ada-a315-4c39-9f56-bb54525a0df2-kube-api-access-4hjlf\") pod \"nova-cell0-conductor-db-sync-fmjq4\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " pod="openstack/nova-cell0-conductor-db-sync-fmjq4"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.937037 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 06 12:27:45 crc kubenswrapper[4892]: I1006 12:27:45.955925 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:27:46 crc kubenswrapper[4892]: I1006 12:27:46.038798 4892 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" Oct 06 12:27:46 crc kubenswrapper[4892]: I1006 12:27:46.079028 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerStarted","Data":"2bab45f02506186ff7a6763168b743907294000b11e4faec3a64df9db62b67d3"} Oct 06 12:27:46 crc kubenswrapper[4892]: I1006 12:27:46.190339 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfaa25f1-1ec7-470c-9660-7342674c6786" path="/var/lib/kubelet/pods/cfaa25f1-1ec7-470c-9660-7342674c6786/volumes" Oct 06 12:27:46 crc kubenswrapper[4892]: I1006 12:27:46.559819 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmjq4"] Oct 06 12:27:47 crc kubenswrapper[4892]: I1006 12:27:47.096825 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerStarted","Data":"d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0"} Oct 06 12:27:47 crc kubenswrapper[4892]: I1006 12:27:47.097133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerStarted","Data":"387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa"} Oct 06 12:27:47 crc kubenswrapper[4892]: I1006 12:27:47.098615 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" event={"ID":"812d3ada-a315-4c39-9f56-bb54525a0df2","Type":"ContainerStarted","Data":"4b4a88301c46687c7222ef47cf6a352b4f46cdbca4b339709fc6fcda2f17d0d3"} Oct 06 12:27:48 crc kubenswrapper[4892]: I1006 12:27:48.111727 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerStarted","Data":"b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59"} Oct 06 12:27:49 crc kubenswrapper[4892]: I1006 12:27:49.125035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerStarted","Data":"0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca"} Oct 06 12:27:49 crc kubenswrapper[4892]: I1006 12:27:49.125533 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 12:27:49 crc kubenswrapper[4892]: I1006 12:27:49.156520 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6086056709999998 podStartE2EDuration="4.156499786s" podCreationTimestamp="2025-10-06 12:27:45 +0000 UTC" firstStartedPulling="2025-10-06 12:27:45.957168614 +0000 UTC m=+1152.506874379" lastFinishedPulling="2025-10-06 12:27:48.505062729 +0000 UTC m=+1155.054768494" observedRunningTime="2025-10-06 12:27:49.14596485 +0000 UTC m=+1155.695670625" watchObservedRunningTime="2025-10-06 12:27:49.156499786 +0000 UTC m=+1155.706205551" Oct 06 12:27:55 crc kubenswrapper[4892]: I1006 12:27:55.629018 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.190:8776/healthcheck\": dial tcp 10.217.0.190:8776: connect: connection refused" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.206283 4892 generic.go:334] "Generic (PLEG): container 
finished" podID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerID="17513f38162f888ac8fe00ed7de663f015912c8bf24aefa05de2f13ad9a92af4" exitCode=137 Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.206829 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"455b9aa7-eeb4-40aa-ae93-872b577730d4","Type":"ContainerDied","Data":"17513f38162f888ac8fe00ed7de663f015912c8bf24aefa05de2f13ad9a92af4"} Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.661868 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.677627 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data-custom\") pod \"455b9aa7-eeb4-40aa-ae93-872b577730d4\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.677797 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbl25\" (UniqueName: \"kubernetes.io/projected/455b9aa7-eeb4-40aa-ae93-872b577730d4-kube-api-access-cbl25\") pod \"455b9aa7-eeb4-40aa-ae93-872b577730d4\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.677917 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/455b9aa7-eeb4-40aa-ae93-872b577730d4-logs\") pod \"455b9aa7-eeb4-40aa-ae93-872b577730d4\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.677960 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-scripts\") pod \"455b9aa7-eeb4-40aa-ae93-872b577730d4\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.678024 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data\") pod \"455b9aa7-eeb4-40aa-ae93-872b577730d4\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.678068 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-combined-ca-bundle\") pod \"455b9aa7-eeb4-40aa-ae93-872b577730d4\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.678173 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/455b9aa7-eeb4-40aa-ae93-872b577730d4-etc-machine-id\") pod \"455b9aa7-eeb4-40aa-ae93-872b577730d4\" (UID: \"455b9aa7-eeb4-40aa-ae93-872b577730d4\") " Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.678858 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/455b9aa7-eeb4-40aa-ae93-872b577730d4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "455b9aa7-eeb4-40aa-ae93-872b577730d4" (UID: "455b9aa7-eeb4-40aa-ae93-872b577730d4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.683116 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/455b9aa7-eeb4-40aa-ae93-872b577730d4-logs" (OuterVolumeSpecName: "logs") pod "455b9aa7-eeb4-40aa-ae93-872b577730d4" (UID: "455b9aa7-eeb4-40aa-ae93-872b577730d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.685220 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455b9aa7-eeb4-40aa-ae93-872b577730d4-kube-api-access-cbl25" (OuterVolumeSpecName: "kube-api-access-cbl25") pod "455b9aa7-eeb4-40aa-ae93-872b577730d4" (UID: "455b9aa7-eeb4-40aa-ae93-872b577730d4"). InnerVolumeSpecName "kube-api-access-cbl25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.689607 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-scripts" (OuterVolumeSpecName: "scripts") pod "455b9aa7-eeb4-40aa-ae93-872b577730d4" (UID: "455b9aa7-eeb4-40aa-ae93-872b577730d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.707492 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "455b9aa7-eeb4-40aa-ae93-872b577730d4" (UID: "455b9aa7-eeb4-40aa-ae93-872b577730d4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.730772 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "455b9aa7-eeb4-40aa-ae93-872b577730d4" (UID: "455b9aa7-eeb4-40aa-ae93-872b577730d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.774492 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data" (OuterVolumeSpecName: "config-data") pod "455b9aa7-eeb4-40aa-ae93-872b577730d4" (UID: "455b9aa7-eeb4-40aa-ae93-872b577730d4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.782700 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbl25\" (UniqueName: \"kubernetes.io/projected/455b9aa7-eeb4-40aa-ae93-872b577730d4-kube-api-access-cbl25\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.782724 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/455b9aa7-eeb4-40aa-ae93-872b577730d4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.782733 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.782743 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.782752 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.782759 4892 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/455b9aa7-eeb4-40aa-ae93-872b577730d4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:56 crc kubenswrapper[4892]: I1006 12:27:56.782767 4892 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/455b9aa7-eeb4-40aa-ae93-872b577730d4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.219069 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"455b9aa7-eeb4-40aa-ae93-872b577730d4","Type":"ContainerDied","Data":"3fafb9db75ef842495305bcb5b017578f284048fd9effb8db5394d2e5b4c19bd"} Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.219097 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.219142 4892 scope.go:117] "RemoveContainer" containerID="17513f38162f888ac8fe00ed7de663f015912c8bf24aefa05de2f13ad9a92af4" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.222004 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" event={"ID":"812d3ada-a315-4c39-9f56-bb54525a0df2","Type":"ContainerStarted","Data":"40cb54a72745796db114d6a68e744b96d70ceaf7170abf1fa781b59cf666a7da"} Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.254078 4892 scope.go:117] "RemoveContainer" containerID="1142e5a9f6af62beb67e814f2f19a17657ea6b831ae2cadca200c1e14e6d7943" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.258288 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" podStartSLOduration=2.238927195 podStartE2EDuration="12.258270121s" podCreationTimestamp="2025-10-06 12:27:45 +0000 UTC" firstStartedPulling="2025-10-06 12:27:46.550835312 +0000 UTC m=+1153.100541077" lastFinishedPulling="2025-10-06 12:27:56.570178238 +0000 UTC m=+1163.119884003" observedRunningTime="2025-10-06 12:27:57.245896391 +0000 UTC m=+1163.795602156" watchObservedRunningTime="2025-10-06 12:27:57.258270121 +0000 UTC m=+1163.807975886" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.274552 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.296776 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.303877 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:57 crc kubenswrapper[4892]: E1006 12:27:57.304432 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.304453 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api" Oct 06 12:27:57 crc kubenswrapper[4892]: E1006 12:27:57.304474 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api-log" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.304482 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api-log" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.304778 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.304800 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" containerName="cinder-api-log" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.306075 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.308975 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.309146 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.309269 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.316513 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.395783 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.395824 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-scripts\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.396124 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.396173 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rwp\" (UniqueName: \"kubernetes.io/projected/9f76a315-cedc-4ab9-9838-6a58823db3e2-kube-api-access-s4rwp\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.396211 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-config-data\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.396271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.396369 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f76a315-cedc-4ab9-9838-6a58823db3e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.396516 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.396542 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f76a315-cedc-4ab9-9838-6a58823db3e2-logs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.498966 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499028 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4rwp\" (UniqueName: \"kubernetes.io/projected/9f76a315-cedc-4ab9-9838-6a58823db3e2-kube-api-access-s4rwp\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499061 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-config-data\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499101 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499141 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f76a315-cedc-4ab9-9838-6a58823db3e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499193 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499217 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f76a315-cedc-4ab9-9838-6a58823db3e2-logs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499249 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9f76a315-cedc-4ab9-9838-6a58823db3e2-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499265 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499290 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-scripts\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.499718 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f76a315-cedc-4ab9-9838-6a58823db3e2-logs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.504546 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-scripts\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.504756 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.506921 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.506951 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-config-data\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.513925 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-config-data-custom\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.516926 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f76a315-cedc-4ab9-9838-6a58823db3e2-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.523250 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rwp\" (UniqueName: \"kubernetes.io/projected/9f76a315-cedc-4ab9-9838-6a58823db3e2-kube-api-access-s4rwp\") pod \"cinder-api-0\" (UID: \"9f76a315-cedc-4ab9-9838-6a58823db3e2\") " pod="openstack/cinder-api-0" Oct 06 12:27:57 crc kubenswrapper[4892]: I1006 12:27:57.628762 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.007914 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.008629 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-central-agent" containerID="cri-o://387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa" gracePeriod=30 Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.008697 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="proxy-httpd" containerID="cri-o://0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca" gracePeriod=30 Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.008777 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="sg-core" containerID="cri-o://b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59" gracePeriod=30 Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.008799 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-notification-agent" containerID="cri-o://d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0" gracePeriod=30 Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.017942 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.203:3000/\": EOF" Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.091348 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.190065 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="455b9aa7-eeb4-40aa-ae93-872b577730d4" path="/var/lib/kubelet/pods/455b9aa7-eeb4-40aa-ae93-872b577730d4/volumes" Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.235047 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f76a315-cedc-4ab9-9838-6a58823db3e2","Type":"ContainerStarted","Data":"2987fba2c21adb3e3b77b985ad388f1dd48e3854591ce4622cf94a39ad03cc1f"} Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.237818 4892 generic.go:334] "Generic (PLEG): container finished" podID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerID="0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca" exitCode=0 Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.237845 4892 generic.go:334] "Generic (PLEG): container finished" podID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerID="b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59" exitCode=2 Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.237906 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerDied","Data":"0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca"} Oct 06 12:27:58 crc kubenswrapper[4892]: I1006 12:27:58.237964 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerDied","Data":"b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59"} Oct 06 12:27:59 crc kubenswrapper[4892]: I1006 12:27:59.252115 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f76a315-cedc-4ab9-9838-6a58823db3e2","Type":"ContainerStarted","Data":"45b4b771436165c406d5ab73040f9b871df0ff851b10fee9e8ae5d365a92a7ac"} Oct 06 12:27:59 crc kubenswrapper[4892]: I1006 12:27:59.256527 4892 generic.go:334] "Generic (PLEG): container finished" podID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerID="387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa" exitCode=0 Oct 06 12:27:59 crc kubenswrapper[4892]: I1006 12:27:59.256576 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerDied","Data":"387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa"} Oct 06 12:28:00 crc kubenswrapper[4892]: I1006 12:28:00.268997 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9f76a315-cedc-4ab9-9838-6a58823db3e2","Type":"ContainerStarted","Data":"c8b3b527b155562baa363c9532c71bfad17f0a06a14460b8d4f441f976d9a346"} Oct 06 12:28:00 crc kubenswrapper[4892]: I1006 12:28:00.269464 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 06 12:28:00 crc kubenswrapper[4892]: I1006 12:28:00.294972 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.294946135 podStartE2EDuration="3.294946135s" podCreationTimestamp="2025-10-06 12:27:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:00.286949383 +0000 UTC m=+1166.836655158" watchObservedRunningTime="2025-10-06 12:28:00.294946135 +0000 UTC m=+1166.844651930" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.280461 4892 generic.go:334] "Generic (PLEG): container finished" podID="6002d110-e634-47ab-b33b-652cbf7b3466" containerID="4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4" exitCode=137 Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.280535 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerDied","Data":"4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4"} Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.280891 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"6002d110-e634-47ab-b33b-652cbf7b3466","Type":"ContainerDied","Data":"185f3e3e519bd3ba5a5684f2a5af6ffbe2c957e6a288affcf5220fb3bba67df2"} Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.280902 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185f3e3e519bd3ba5a5684f2a5af6ffbe2c957e6a288affcf5220fb3bba67df2" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.280918 4892 scope.go:117] "RemoveContainer" containerID="c85ab94e8ec97268a14765817c99601fa0c5edc20a0cba739a4dfcd7ccbe6eab" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.344097 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.425628 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clbzv\" (UniqueName: \"kubernetes.io/projected/6002d110-e634-47ab-b33b-652cbf7b3466-kube-api-access-clbzv\") pod \"6002d110-e634-47ab-b33b-652cbf7b3466\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.425679 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-combined-ca-bundle\") pod \"6002d110-e634-47ab-b33b-652cbf7b3466\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.425729 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-config-data\") pod \"6002d110-e634-47ab-b33b-652cbf7b3466\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.425750 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-custom-prometheus-ca\") pod \"6002d110-e634-47ab-b33b-652cbf7b3466\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.425927 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d110-e634-47ab-b33b-652cbf7b3466-logs\") pod \"6002d110-e634-47ab-b33b-652cbf7b3466\" (UID: \"6002d110-e634-47ab-b33b-652cbf7b3466\") " Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.427665 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6002d110-e634-47ab-b33b-652cbf7b3466-logs" (OuterVolumeSpecName: "logs") pod "6002d110-e634-47ab-b33b-652cbf7b3466" (UID: "6002d110-e634-47ab-b33b-652cbf7b3466"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.444691 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6002d110-e634-47ab-b33b-652cbf7b3466-kube-api-access-clbzv" (OuterVolumeSpecName: "kube-api-access-clbzv") pod "6002d110-e634-47ab-b33b-652cbf7b3466" (UID: "6002d110-e634-47ab-b33b-652cbf7b3466"). InnerVolumeSpecName "kube-api-access-clbzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.459958 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6002d110-e634-47ab-b33b-652cbf7b3466" (UID: "6002d110-e634-47ab-b33b-652cbf7b3466"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.471148 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "6002d110-e634-47ab-b33b-652cbf7b3466" (UID: "6002d110-e634-47ab-b33b-652cbf7b3466"). 
InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.490299 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-config-data" (OuterVolumeSpecName: "config-data") pod "6002d110-e634-47ab-b33b-652cbf7b3466" (UID: "6002d110-e634-47ab-b33b-652cbf7b3466"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.527741 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6002d110-e634-47ab-b33b-652cbf7b3466-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.527773 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clbzv\" (UniqueName: \"kubernetes.io/projected/6002d110-e634-47ab-b33b-652cbf7b3466-kube-api-access-clbzv\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.527782 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.527791 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:01 crc kubenswrapper[4892]: I1006 12:28:01.527799 4892 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/6002d110-e634-47ab-b33b-652cbf7b3466-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.033539 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.140885 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-scripts\") pod \"320898d4-f2c7-4164-830b-ee3d96e34d28\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141024 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-config-data\") pod \"320898d4-f2c7-4164-830b-ee3d96e34d28\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141093 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-run-httpd\") pod \"320898d4-f2c7-4164-830b-ee3d96e34d28\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141134 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfmjt\" (UniqueName: \"kubernetes.io/projected/320898d4-f2c7-4164-830b-ee3d96e34d28-kube-api-access-vfmjt\") pod \"320898d4-f2c7-4164-830b-ee3d96e34d28\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141153 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-sg-core-conf-yaml\") pod \"320898d4-f2c7-4164-830b-ee3d96e34d28\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141182 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-log-httpd\") pod \"320898d4-f2c7-4164-830b-ee3d96e34d28\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141200 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-combined-ca-bundle\") pod \"320898d4-f2c7-4164-830b-ee3d96e34d28\" (UID: \"320898d4-f2c7-4164-830b-ee3d96e34d28\") " Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141469 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "320898d4-f2c7-4164-830b-ee3d96e34d28" (UID: "320898d4-f2c7-4164-830b-ee3d96e34d28"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.141920 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.145451 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-scripts" (OuterVolumeSpecName: "scripts") pod "320898d4-f2c7-4164-830b-ee3d96e34d28" (UID: "320898d4-f2c7-4164-830b-ee3d96e34d28"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.145723 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "320898d4-f2c7-4164-830b-ee3d96e34d28" (UID: "320898d4-f2c7-4164-830b-ee3d96e34d28"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.157204 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320898d4-f2c7-4164-830b-ee3d96e34d28-kube-api-access-vfmjt" (OuterVolumeSpecName: "kube-api-access-vfmjt") pod "320898d4-f2c7-4164-830b-ee3d96e34d28" (UID: "320898d4-f2c7-4164-830b-ee3d96e34d28"). InnerVolumeSpecName "kube-api-access-vfmjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.193990 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "320898d4-f2c7-4164-830b-ee3d96e34d28" (UID: "320898d4-f2c7-4164-830b-ee3d96e34d28"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.252314 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfmjt\" (UniqueName: \"kubernetes.io/projected/320898d4-f2c7-4164-830b-ee3d96e34d28-kube-api-access-vfmjt\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.252373 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.252391 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/320898d4-f2c7-4164-830b-ee3d96e34d28-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.252408 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.256919 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "320898d4-f2c7-4164-830b-ee3d96e34d28" (UID: "320898d4-f2c7-4164-830b-ee3d96e34d28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.294288 4892 generic.go:334] "Generic (PLEG): container finished" podID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerID="d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0" exitCode=0 Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.294391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerDied","Data":"d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0"} Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.294430 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"320898d4-f2c7-4164-830b-ee3d96e34d28","Type":"ContainerDied","Data":"2bab45f02506186ff7a6763168b743907294000b11e4faec3a64df9db62b67d3"} Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.294453 4892 scope.go:117] "RemoveContainer" containerID="0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.294573 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.300941 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.306418 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-config-data" (OuterVolumeSpecName: "config-data") pod "320898d4-f2c7-4164-830b-ee3d96e34d28" (UID: "320898d4-f2c7-4164-830b-ee3d96e34d28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.357206 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.358427 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.358463 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320898d4-f2c7-4164-830b-ee3d96e34d28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.364533 4892 scope.go:117] "RemoveContainer" containerID="b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.373576 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379149 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.379617 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="sg-core" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379629 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="sg-core" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.379648 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379654 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.379662 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="proxy-httpd" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379668 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="proxy-httpd" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.379685 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379690 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.379711 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-notification-agent" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379717 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-notification-agent" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.379728 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-central-agent" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379734 4892 
state_mem.go:107] "Deleted CPUSet assignment" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-central-agent" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379896 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="proxy-httpd" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379918 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379927 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379936 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379944 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379956 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-central-agent" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379969 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="sg-core" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.379980 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" containerName="ceilometer-notification-agent" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.382720 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.386695 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.386708 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.398782 4892 scope.go:117] "RemoveContainer" containerID="d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.423749 4892 scope.go:117] "RemoveContainer" containerID="387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.439672 4892 scope.go:117] "RemoveContainer" containerID="0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.440101 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca\": container with ID starting with 0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca not found: ID does not exist" containerID="0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.440216 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca"} err="failed to get container status \"0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca\": rpc error: code = NotFound desc = could not find container \"0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca\": container with ID starting with 0733a86c85562d91b5ea92064c4ad6e2358104f9cc37a211f24914f4fd208dca not found: ID does not exist" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.440343 4892 scope.go:117] "RemoveContainer" containerID="b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.440695 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59\": container with ID starting with b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59 not found: ID does not exist" containerID="b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.440768 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59"} err="failed to get container status \"b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59\": rpc error: code = NotFound desc = could not find container \"b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59\": container with ID starting with b32ea374029ca195e19ab20e1e6a39f7a666f467a914bc26f3feb3f684e97b59 not found: ID does not exist" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.440805 4892 scope.go:117] "RemoveContainer" containerID="d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.441427 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0\": container with ID starting with d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0 not found: ID does not exist" containerID="d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.441550 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0"} err="failed to get container status \"d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0\": rpc error: code = NotFound desc = could not find container \"d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0\": container with ID starting with d1e5f3f229e7058b1d642a06af5dd9ec7120324339a1339b4dfc87e726c8aae0 not found: ID does not exist" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.441643 4892 scope.go:117] "RemoveContainer" containerID="387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.441937 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa\": container with ID starting with 387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa not found: ID does not exist" containerID="387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.441958 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa"} err="failed to get container status \"387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa\": rpc error: code = NotFound desc = could not find container \"387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa\": container with ID starting with 387c6a30537405249da8bdc99ce32be565b1933fa0ade72fb710ce37ac791eaa not found: ID does not exist" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.562727 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.562863 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.562920 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-logs\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.562963 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: 
\"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.563030 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtf62\" (UniqueName: \"kubernetes.io/projected/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-kube-api-access-vtf62\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.637778 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.653612 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.662380 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.663143 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.663166 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: E1006 12:28:02.663181 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.663188 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" containerName="watcher-decision-engine" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.664316 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.664384 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-logs\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.664413 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.664452 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtf62\" (UniqueName: \"kubernetes.io/projected/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-kube-api-access-vtf62\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.664570 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.665058 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-logs\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.665767 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.668633 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.669063 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.669310 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.670602 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.680022 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-config-data\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.691250 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtf62\" (UniqueName: \"kubernetes.io/projected/8aa53cf5-94c5-483e-9ef0-bf823a8abff7-kube-api-access-vtf62\") pod \"watcher-decision-engine-0\" (UID: \"8aa53cf5-94c5-483e-9ef0-bf823a8abff7\") " pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.697516 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.703942 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.766760 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.766863 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.766986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-config-data\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.767053 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-scripts\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.767099 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-log-httpd\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.767151 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-run-httpd\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.767241 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8c4w\" (UniqueName: \"kubernetes.io/projected/a1bbb968-deb3-4ce5-9912-8e452f23b35b-kube-api-access-h8c4w\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.872595 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-run-httpd\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.872715 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8c4w\" (UniqueName: \"kubernetes.io/projected/a1bbb968-deb3-4ce5-9912-8e452f23b35b-kube-api-access-h8c4w\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.872770 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.872822 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.872907 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-config-data\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.872957 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-scripts\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.872987 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-log-httpd\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.873381 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-run-httpd\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.873399 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-log-httpd\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.878073 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-scripts\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.878637 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-config-data\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.879145 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.880746 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.893270 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8c4w\" (UniqueName: \"kubernetes.io/projected/a1bbb968-deb3-4ce5-9912-8e452f23b35b-kube-api-access-h8c4w\") pod \"ceilometer-0\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " pod="openstack/ceilometer-0" Oct 06 12:28:02 crc kubenswrapper[4892]: I1006 12:28:02.998689 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:03 crc kubenswrapper[4892]: I1006 12:28:03.175580 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 06 12:28:03 crc kubenswrapper[4892]: W1006 12:28:03.188652 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aa53cf5_94c5_483e_9ef0_bf823a8abff7.slice/crio-0ea9d9fc7a43449f7383e4f0571c5b40615f6361f79c82f6127c623cd6c5dddb WatchSource:0}: Error finding container 0ea9d9fc7a43449f7383e4f0571c5b40615f6361f79c82f6127c623cd6c5dddb: Status 404 returned error can't find the container with id 0ea9d9fc7a43449f7383e4f0571c5b40615f6361f79c82f6127c623cd6c5dddb Oct 06 12:28:03 crc kubenswrapper[4892]: I1006 12:28:03.313747 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8aa53cf5-94c5-483e-9ef0-bf823a8abff7","Type":"ContainerStarted","Data":"0ea9d9fc7a43449f7383e4f0571c5b40615f6361f79c82f6127c623cd6c5dddb"} Oct 06 12:28:03 crc kubenswrapper[4892]: I1006 12:28:03.645461 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:03 crc kubenswrapper[4892]: W1006 12:28:03.652857 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb968_deb3_4ce5_9912_8e452f23b35b.slice/crio-872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a WatchSource:0}: Error finding container 872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a: Status 404 returned error can't find the container with id 872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a Oct 06 12:28:04 crc kubenswrapper[4892]: I1006 12:28:04.182255 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320898d4-f2c7-4164-830b-ee3d96e34d28" path="/var/lib/kubelet/pods/320898d4-f2c7-4164-830b-ee3d96e34d28/volumes" Oct 06 12:28:04 crc kubenswrapper[4892]: I1006 12:28:04.184005 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6002d110-e634-47ab-b33b-652cbf7b3466" path="/var/lib/kubelet/pods/6002d110-e634-47ab-b33b-652cbf7b3466/volumes" Oct 06 12:28:04 crc kubenswrapper[4892]: I1006 12:28:04.331338 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"8aa53cf5-94c5-483e-9ef0-bf823a8abff7","Type":"ContainerStarted","Data":"1ac4b32e1a5684f54706eabf81c933a8f5e342832b8e687d30179f67ba2d36bf"} Oct 06 12:28:04 crc kubenswrapper[4892]: I1006 12:28:04.351676 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerStarted","Data":"bc33889523b250f2ece09404b373c7622b6ac3391bdc923a873fee1800f74d80"} Oct 06 12:28:04 crc 
kubenswrapper[4892]: I1006 12:28:04.351727 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerStarted","Data":"d7adf2a25e7308a944dff4171113fc14431e0191db01f9bb204e928abc917580"} Oct 06 12:28:04 crc kubenswrapper[4892]: I1006 12:28:04.351741 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerStarted","Data":"872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a"} Oct 06 12:28:04 crc kubenswrapper[4892]: I1006 12:28:04.387809 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.387786502 podStartE2EDuration="2.387786502s" podCreationTimestamp="2025-10-06 12:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:04.385187366 +0000 UTC m=+1170.934893131" watchObservedRunningTime="2025-10-06 12:28:04.387786502 +0000 UTC m=+1170.937492287" Oct 06 12:28:05 crc kubenswrapper[4892]: I1006 12:28:05.370816 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerStarted","Data":"31ee040d1802be617657ee2e7583e92d9ac815b91a4dc9f9b9e519cb7a6b1ebe"} Oct 06 12:28:06 crc kubenswrapper[4892]: I1006 12:28:06.386592 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerStarted","Data":"bd53166acfea668ca2c0075070652c28dfb0422c60e729afe0eb8e8f5e347803"} Oct 06 12:28:06 crc kubenswrapper[4892]: I1006 12:28:06.388208 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 12:28:06 crc kubenswrapper[4892]: I1006 12:28:06.424791 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.98977547 podStartE2EDuration="4.424772355s" podCreationTimestamp="2025-10-06 12:28:02 +0000 UTC" firstStartedPulling="2025-10-06 12:28:03.655493564 +0000 UTC m=+1170.205199369" lastFinishedPulling="2025-10-06 12:28:06.090490489 +0000 UTC m=+1172.640196254" observedRunningTime="2025-10-06 12:28:06.408462871 +0000 UTC m=+1172.958168646" watchObservedRunningTime="2025-10-06 12:28:06.424772355 +0000 UTC m=+1172.974478130" Oct 06 12:28:09 crc kubenswrapper[4892]: I1006 12:28:09.802925 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 06 12:28:10 crc kubenswrapper[4892]: I1006 12:28:10.426063 4892 generic.go:334] "Generic (PLEG): container finished" podID="812d3ada-a315-4c39-9f56-bb54525a0df2" containerID="40cb54a72745796db114d6a68e744b96d70ceaf7170abf1fa781b59cf666a7da" exitCode=0 Oct 06 12:28:10 crc kubenswrapper[4892]: I1006 12:28:10.426171 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" event={"ID":"812d3ada-a315-4c39-9f56-bb54525a0df2","Type":"ContainerDied","Data":"40cb54a72745796db114d6a68e744b96d70ceaf7170abf1fa781b59cf666a7da"} Oct 06 12:28:11 crc kubenswrapper[4892]: I1006 12:28:11.901278 4892 util.go:48] "No ready sandbox for pod can be found. 
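
The two "Observed pod startup duration" lines above invite a quick cross-check. watcher-decision-engine-0 pulled no images (its pull timestamps are the zero value 0001-01-01), so podStartSLOduration equals podStartE2EDuration at 2.387786502s. For ceilometer-0 the SLO figure excludes image pulling: 4.424772355s - (12:28:06.090490489 - 12:28:03.655493564) = 4.424772355s - 2.434996925s = 1.989775430s, which agrees with the logged 1.98977547s up to rounding of the internal timestamps. The same arithmetic in Go:

    package main

    import (
        "fmt"
        "time"
    )

    // mustParse parses the timestamp format used by the latency-tracker lines.
    // time.Parse accepts fractional seconds even when the layout omits them.
    func mustParse(s string) time.Time {
        t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-10-06 12:28:02 +0000 UTC")
        firstPull := mustParse("2025-10-06 12:28:03.655493564 +0000 UTC")
        lastPull := mustParse("2025-10-06 12:28:06.090490489 +0000 UTC")
        running := mustParse("2025-10-06 12:28:06.424772355 +0000 UTC") // watchObservedRunningTime

        e2e := running.Sub(created)
        slo := e2e - lastPull.Sub(firstPull) // startup time excluding image pulls
        fmt.Println("podStartE2EDuration:", e2e) // 4.424772355s
        fmt.Println("podStartSLOduration:", slo) // ~1.98977543s
    }
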
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.072730 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hjlf\" (UniqueName: \"kubernetes.io/projected/812d3ada-a315-4c39-9f56-bb54525a0df2-kube-api-access-4hjlf\") pod \"812d3ada-a315-4c39-9f56-bb54525a0df2\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.072964 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-combined-ca-bundle\") pod \"812d3ada-a315-4c39-9f56-bb54525a0df2\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.073053 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-scripts\") pod \"812d3ada-a315-4c39-9f56-bb54525a0df2\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.073115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-config-data\") pod \"812d3ada-a315-4c39-9f56-bb54525a0df2\" (UID: \"812d3ada-a315-4c39-9f56-bb54525a0df2\") " Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.079638 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812d3ada-a315-4c39-9f56-bb54525a0df2-kube-api-access-4hjlf" (OuterVolumeSpecName: "kube-api-access-4hjlf") pod "812d3ada-a315-4c39-9f56-bb54525a0df2" (UID: "812d3ada-a315-4c39-9f56-bb54525a0df2"). InnerVolumeSpecName "kube-api-access-4hjlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.086582 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-scripts" (OuterVolumeSpecName: "scripts") pod "812d3ada-a315-4c39-9f56-bb54525a0df2" (UID: "812d3ada-a315-4c39-9f56-bb54525a0df2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.102934 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-config-data" (OuterVolumeSpecName: "config-data") pod "812d3ada-a315-4c39-9f56-bb54525a0df2" (UID: "812d3ada-a315-4c39-9f56-bb54525a0df2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.112316 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "812d3ada-a315-4c39-9f56-bb54525a0df2" (UID: "812d3ada-a315-4c39-9f56-bb54525a0df2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.175743 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.176037 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.176050 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/812d3ada-a315-4c39-9f56-bb54525a0df2-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.176061 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hjlf\" (UniqueName: \"kubernetes.io/projected/812d3ada-a315-4c39-9f56-bb54525a0df2-kube-api-access-4hjlf\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.454134 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" event={"ID":"812d3ada-a315-4c39-9f56-bb54525a0df2","Type":"ContainerDied","Data":"4b4a88301c46687c7222ef47cf6a352b4f46cdbca4b339709fc6fcda2f17d0d3"} Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.454177 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b4a88301c46687c7222ef47cf6a352b4f46cdbca4b339709fc6fcda2f17d0d3" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.454218 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-fmjq4" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.656192 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 12:28:12 crc kubenswrapper[4892]: E1006 12:28:12.656772 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812d3ada-a315-4c39-9f56-bb54525a0df2" containerName="nova-cell0-conductor-db-sync" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.656795 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="812d3ada-a315-4c39-9f56-bb54525a0df2" containerName="nova-cell0-conductor-db-sync" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.657101 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="812d3ada-a315-4c39-9f56-bb54525a0df2" containerName="nova-cell0-conductor-db-sync" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.657938 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.660763 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hngwc" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.664791 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.689991 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.704417 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.759152 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.787783 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.787961 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.788057 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnf2v\" (UniqueName: \"kubernetes.io/projected/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-kube-api-access-vnf2v\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.889380 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.890106 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnf2v\" (UniqueName: \"kubernetes.io/projected/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-kube-api-access-vnf2v\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.890225 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.896931 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.904848 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.917628 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnf2v\" (UniqueName: \"kubernetes.io/projected/e8ca03eb-d66d-4149-9b53-aeefb4d1478c-kube-api-access-vnf2v\") pod \"nova-cell0-conductor-0\" (UID: \"e8ca03eb-d66d-4149-9b53-aeefb4d1478c\") " pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:12 crc kubenswrapper[4892]: I1006 12:28:12.983075 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:13 crc kubenswrapper[4892]: I1006 12:28:13.461572 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 06 12:28:13 crc kubenswrapper[4892]: I1006 12:28:13.468694 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e8ca03eb-d66d-4149-9b53-aeefb4d1478c","Type":"ContainerStarted","Data":"ce8bc84446903452d0d7465f1c124c9c0ea818af374340ba8a9bb054bd28d1f0"} Oct 06 12:28:13 crc kubenswrapper[4892]: I1006 12:28:13.468752 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:13 crc kubenswrapper[4892]: I1006 12:28:13.518742 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 06 12:28:14 crc kubenswrapper[4892]: I1006 12:28:14.488988 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"e8ca03eb-d66d-4149-9b53-aeefb4d1478c","Type":"ContainerStarted","Data":"38544078ec910549a0301fb8307331c607add4a80315926bd43ac5e6734fdcc0"} Oct 06 12:28:14 crc kubenswrapper[4892]: I1006 12:28:14.489608 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:14 crc kubenswrapper[4892]: I1006 12:28:14.514031 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.513981813 podStartE2EDuration="2.513981813s" podCreationTimestamp="2025-10-06 12:28:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:14.505714483 +0000 UTC m=+1181.055420268" watchObservedRunningTime="2025-10-06 12:28:14.513981813 +0000 UTC m=+1181.063687588" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.014605 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.540030 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5jh8x"] Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.541557 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.544162 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.553208 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5jh8x"] Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.559384 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.613778 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmrw7\" (UniqueName: \"kubernetes.io/projected/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-kube-api-access-zmrw7\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.613844 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.613976 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-config-data\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.614044 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-scripts\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.715206 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-config-data\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.715269 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-scripts\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.715336 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmrw7\" (UniqueName: \"kubernetes.io/projected/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-kube-api-access-zmrw7\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.715366 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.724007 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.724193 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-config-data\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.724383 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-scripts\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.729536 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.731076 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.738688 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.753748 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.758586 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.761708 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.764932 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmrw7\" (UniqueName: \"kubernetes.io/projected/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-kube-api-access-zmrw7\") pod \"nova-cell0-cell-mapping-5jh8x\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.802317 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.815383 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.816631 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlwmk\" (UniqueName: \"kubernetes.io/projected/a10b50f1-1410-48d7-a999-43c250a201de-kube-api-access-xlwmk\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.816757 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c0f2a9-8737-4d05-911d-be085ade827a-logs\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.816850 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-config-data\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.816934 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjvk5\" (UniqueName: \"kubernetes.io/projected/20c0f2a9-8737-4d05-911d-be085ade827a-kube-api-access-zjvk5\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.817074 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.817163 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.817263 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-config-data\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " 
pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.868431 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.915311 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.916770 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.919682 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlwmk\" (UniqueName: \"kubernetes.io/projected/a10b50f1-1410-48d7-a999-43c250a201de-kube-api-access-xlwmk\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.919825 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c0f2a9-8737-4d05-911d-be085ade827a-logs\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.919924 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-config-data\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.920004 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjvk5\" (UniqueName: \"kubernetes.io/projected/20c0f2a9-8737-4d05-911d-be085ade827a-kube-api-access-zjvk5\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.920145 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.920230 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.920655 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-config-data\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.921059 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.922158 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c0f2a9-8737-4d05-911d-be085ade827a-logs\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " 
pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.927829 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-config-data\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.939401 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.941070 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-config-data\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.952964 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:23 crc kubenswrapper[4892]: I1006 12:28:23.992040 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlwmk\" (UniqueName: \"kubernetes.io/projected/a10b50f1-1410-48d7-a999-43c250a201de-kube-api-access-xlwmk\") pod \"nova-scheduler-0\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:23.994729 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:23.994894 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjvk5\" (UniqueName: \"kubernetes.io/projected/20c0f2a9-8737-4d05-911d-be085ade827a-kube-api-access-zjvk5\") pod \"nova-api-0\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " pod="openstack/nova-api-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.036426 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.036488 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xds9d\" (UniqueName: \"kubernetes.io/projected/8c792645-87a8-4be5-9a72-927b0b99de2e-kube-api-access-xds9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.036526 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc 
kubenswrapper[4892]: I1006 12:28:24.071388 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.073001 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.077994 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.137763 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.138751 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-logs\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.138795 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb4nd\" (UniqueName: \"kubernetes.io/projected/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-kube-api-access-kb4nd\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.138836 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.138866 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-config-data\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.138895 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.138923 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xds9d\" (UniqueName: \"kubernetes.io/projected/8c792645-87a8-4be5-9a72-927b0b99de2e-kube-api-access-xds9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.139829 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.143966 4892 util.go:30] "No sandbox for pod can be found. 
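
By this point the same three log lines have repeated for every pod created in this window (watcher-decision-engine-0, ceilometer-0, nova-cell0-conductor-0, nova-cell0-cell-mapping-5jh8x, nova-api-0, nova-scheduler-0, nova-cell1-novncproxy-0, nova-metadata-0): "VerifyControllerAttachedVolume started" from reconciler_common.go:245, "MountVolume started" from reconciler_common.go:218, and "MountVolume.SetUp succeeded" from operation_generator.go:637. Empty-dir volumes such as "logs" finish in under a millisecond, while secret, configmap, and projected volumes take a few milliseconds longer because their content has to be written out. The mount-side counterpart of the earlier unmount sketch, again with hypothetical types:

    package main

    import "fmt"

    // volume models an entry in the desired state of the world.
    type volume struct{ name, plugin string }

    // reconcileMounts walks each desired volume through the three phases
    // the log lines above repeat for every new pod.
    func reconcileMounts(desired []volume, mounted map[string]bool) {
        for _, v := range desired {
            if mounted[v.name] {
                continue
            }
            fmt.Printf("VerifyControllerAttachedVolume started for %q (%s)\n", v.name, v.plugin)
            fmt.Printf("MountVolume started for %q\n", v.name)
            // ... the plugin's SetUp() writes secret/configmap content or
            // creates the empty dir here; empty-dir is why "logs" is fast ...
            mounted[v.name] = true
            fmt.Printf("MountVolume.SetUp succeeded for %q\n", v.name)
        }
    }

    func main() {
        reconcileMounts([]volume{
            {"logs", "kubernetes.io/empty-dir"},
            {"config-data", "kubernetes.io/secret"},
            {"kube-api-access-kb4nd", "kubernetes.io/projected"},
        }, map[string]bool{})
    }
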
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.171457 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.226934 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.232256 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xds9d\" (UniqueName: \"kubernetes.io/projected/8c792645-87a8-4be5-9a72-927b0b99de2e-kube-api-access-xds9d\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.232896 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.282194 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.282280 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-config-data\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.282717 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-logs\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.282860 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb4nd\" (UniqueName: \"kubernetes.io/projected/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-kube-api-access-kb4nd\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.294406 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-logs\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.295050 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.295812 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-config-data\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.321017 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb4nd\" (UniqueName: \"kubernetes.io/projected/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-kube-api-access-kb4nd\") pod \"nova-metadata-0\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.427990 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54849b84c9-6zpxg"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.430389 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.445387 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.463392 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54849b84c9-6zpxg"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.554962 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.607872 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-svc\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.607924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-config\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.607966 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-swift-storage-0\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.607997 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchqk\" (UniqueName: \"kubernetes.io/projected/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-kube-api-access-wchqk\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.608046 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-nb\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 
12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.608090 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-sb\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.652110 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5jh8x"] Oct 06 12:28:24 crc kubenswrapper[4892]: W1006 12:28:24.669677 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49c8ba56_1bfe_4b22_bc88_7d25ac9113d0.slice/crio-d699cea7268452ffbf92107c90614463833f97253da7e7fc47251616b6989b2e WatchSource:0}: Error finding container d699cea7268452ffbf92107c90614463833f97253da7e7fc47251616b6989b2e: Status 404 returned error can't find the container with id d699cea7268452ffbf92107c90614463833f97253da7e7fc47251616b6989b2e Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.710575 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-swift-storage-0\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.710622 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wchqk\" (UniqueName: \"kubernetes.io/projected/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-kube-api-access-wchqk\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.710679 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-nb\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.710727 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-sb\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.710772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-svc\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.710804 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-config\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.711443 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-swift-storage-0\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.711587 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-config\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.711696 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-sb\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.712039 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-svc\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.712159 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-nb\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.731618 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchqk\" (UniqueName: \"kubernetes.io/projected/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-kube-api-access-wchqk\") pod \"dnsmasq-dns-54849b84c9-6zpxg\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.825614 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.903172 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.914939 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rpnxm"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.919918 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.921971 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.922697 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.948442 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rpnxm"] Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.960284 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:28:24 crc kubenswrapper[4892]: I1006 12:28:24.993028 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:25 crc kubenswrapper[4892]: W1006 12:28:25.008189 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda10b50f1_1410_48d7_a999_43c250a201de.slice/crio-b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a WatchSource:0}: Error finding container b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a: Status 404 returned error can't find the container with id b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.018749 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-config-data\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.018808 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-scripts\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.018871 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.018947 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82rmt\" (UniqueName: \"kubernetes.io/projected/905c9ab8-2e12-4c06-9a9f-890faab36198-kube-api-access-82rmt\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.043966 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:25 crc kubenswrapper[4892]: W1006 12:28:25.048579 4892 manager.go:1169] Failed to process watch event {EventType:0 
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.088596 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.120055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-config-data\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm"
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.120104 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-scripts\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm"
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.120148 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm"
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.120224 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82rmt\" (UniqueName: \"kubernetes.io/projected/905c9ab8-2e12-4c06-9a9f-890faab36198-kube-api-access-82rmt\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm"
Oct 06 12:28:25 crc kubenswrapper[4892]: W1006 12:28:25.121259 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c792645_87a8_4be5_9a72_927b0b99de2e.slice/crio-cb684b65bfbd065089c9ee7e1942b8e9e6efdf27b6801182f74f33c3bd606cfd WatchSource:0}: Error finding container cb684b65bfbd065089c9ee7e1942b8e9e6efdf27b6801182f74f33c3bd606cfd: Status 404 returned error can't find the container with id cb684b65bfbd065089c9ee7e1942b8e9e6efdf27b6801182f74f33c3bd606cfd
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.126191 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-scripts\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm"
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.126952 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm"
Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.129147 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-config-data\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.135429 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82rmt\" (UniqueName: \"kubernetes.io/projected/905c9ab8-2e12-4c06-9a9f-890faab36198-kube-api-access-82rmt\") pod \"nova-cell1-conductor-db-sync-rpnxm\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.325673 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.392113 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54849b84c9-6zpxg"] Oct 06 12:28:25 crc kubenswrapper[4892]: W1006 12:28:25.410037 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4737ce6_7463_41ae_8dcf_ab3f3f84cb9a.slice/crio-935152dc2096409beb91855b8ed2944612c7e2525afb9d9575e8e1fcfdfa1a0a WatchSource:0}: Error finding container 935152dc2096409beb91855b8ed2944612c7e2525afb9d9575e8e1fcfdfa1a0a: Status 404 returned error can't find the container with id 935152dc2096409beb91855b8ed2944612c7e2525afb9d9575e8e1fcfdfa1a0a Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.638375 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a10b50f1-1410-48d7-a999-43c250a201de","Type":"ContainerStarted","Data":"b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a"} Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.639860 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" event={"ID":"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a","Type":"ContainerStarted","Data":"935152dc2096409beb91855b8ed2944612c7e2525afb9d9575e8e1fcfdfa1a0a"} Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.657068 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1ea700e-e7ac-444f-8dee-a7172d4b4a49","Type":"ContainerStarted","Data":"a7d6d35411cfbd1e45b0792f4b1fdb3a25606bed22b344d976b3f391bfb57674"} Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.668244 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5jh8x" event={"ID":"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0","Type":"ContainerStarted","Data":"26250415420fe80d7f2ca52bdfdda08ae762ecbaf0a1346ee402bb75255619bf"} Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.668667 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5jh8x" event={"ID":"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0","Type":"ContainerStarted","Data":"d699cea7268452ffbf92107c90614463833f97253da7e7fc47251616b6989b2e"} Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.670719 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20c0f2a9-8737-4d05-911d-be085ade827a","Type":"ContainerStarted","Data":"8c24626d298ce78a3d48004efec81d6b44f168d5931b8c9f23b49fc3cfa6abf9"} Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.681001 4892 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c792645-87a8-4be5-9a72-927b0b99de2e","Type":"ContainerStarted","Data":"cb684b65bfbd065089c9ee7e1942b8e9e6efdf27b6801182f74f33c3bd606cfd"} Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.703294 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5jh8x" podStartSLOduration=2.70326655 podStartE2EDuration="2.70326655s" podCreationTimestamp="2025-10-06 12:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:25.688199722 +0000 UTC m=+1192.237905487" watchObservedRunningTime="2025-10-06 12:28:25.70326655 +0000 UTC m=+1192.252972325" Oct 06 12:28:25 crc kubenswrapper[4892]: I1006 12:28:25.948106 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rpnxm"] Oct 06 12:28:26 crc kubenswrapper[4892]: I1006 12:28:26.689806 4892 generic.go:334] "Generic (PLEG): container finished" podID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerID="3a0ff92d2b15287d4f53d31677923680c3531cefe8e9360909ce2db12478679f" exitCode=0 Oct 06 12:28:26 crc kubenswrapper[4892]: I1006 12:28:26.689885 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" event={"ID":"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a","Type":"ContainerDied","Data":"3a0ff92d2b15287d4f53d31677923680c3531cefe8e9360909ce2db12478679f"} Oct 06 12:28:27 crc kubenswrapper[4892]: I1006 12:28:27.432389 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:28:27 crc kubenswrapper[4892]: I1006 12:28:27.452476 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:27 crc kubenswrapper[4892]: I1006 12:28:27.701354 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" event={"ID":"905c9ab8-2e12-4c06-9a9f-890faab36198","Type":"ContainerStarted","Data":"914eb5c03c9733879e11323b0301c09bc166efe16cfc243faf87db919aa58762"} Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.723236 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c792645-87a8-4be5-9a72-927b0b99de2e","Type":"ContainerStarted","Data":"c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f"} Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.723881 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8c792645-87a8-4be5-9a72-927b0b99de2e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f" gracePeriod=30 Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.736696 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a10b50f1-1410-48d7-a999-43c250a201de","Type":"ContainerStarted","Data":"fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b"} Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.740832 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1ea700e-e7ac-444f-8dee-a7172d4b4a49","Type":"ContainerStarted","Data":"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b"} Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.758788 4892 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" event={"ID":"905c9ab8-2e12-4c06-9a9f-890faab36198","Type":"ContainerStarted","Data":"018459029c3e97dc59ff073e0b9f90cc653ff818e14cfcc4ba96541ca86fb3f2"} Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.769949 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.750015269 podStartE2EDuration="5.769928876s" podCreationTimestamp="2025-10-06 12:28:23 +0000 UTC" firstStartedPulling="2025-10-06 12:28:25.128299055 +0000 UTC m=+1191.678004820" lastFinishedPulling="2025-10-06 12:28:28.148212662 +0000 UTC m=+1194.697918427" observedRunningTime="2025-10-06 12:28:28.750976375 +0000 UTC m=+1195.300682140" watchObservedRunningTime="2025-10-06 12:28:28.769928876 +0000 UTC m=+1195.319634651" Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.770956 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" event={"ID":"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a","Type":"ContainerStarted","Data":"cfad99feca6c74434b5ca451369cb5b5190835929a02390324a9135835bfbb95"} Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.771104 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.641800892 podStartE2EDuration="5.771094229s" podCreationTimestamp="2025-10-06 12:28:23 +0000 UTC" firstStartedPulling="2025-10-06 12:28:25.013670973 +0000 UTC m=+1191.563376738" lastFinishedPulling="2025-10-06 12:28:28.14296431 +0000 UTC m=+1194.692670075" observedRunningTime="2025-10-06 12:28:28.765745874 +0000 UTC m=+1195.315451639" watchObservedRunningTime="2025-10-06 12:28:28.771094229 +0000 UTC m=+1195.320800024" Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.771719 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.773931 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20c0f2a9-8737-4d05-911d-be085ade827a","Type":"ContainerStarted","Data":"9056e7a48a8bab34d11aa0082d3a806a87ddfd7e1ae89f4b6b3b0f1c1d190f2b"} Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.792617 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" podStartSLOduration=4.792593144 podStartE2EDuration="4.792593144s" podCreationTimestamp="2025-10-06 12:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:28.782291695 +0000 UTC m=+1195.331997460" watchObservedRunningTime="2025-10-06 12:28:28.792593144 +0000 UTC m=+1195.342298909" Oct 06 12:28:28 crc kubenswrapper[4892]: I1006 12:28:28.814100 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" podStartSLOduration=4.8140823390000005 podStartE2EDuration="4.814082339s" podCreationTimestamp="2025-10-06 12:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:28.809391563 +0000 UTC m=+1195.359097328" watchObservedRunningTime="2025-10-06 12:28:28.814082339 +0000 UTC m=+1195.363788104" Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.144396 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-scheduler-0" Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.446920 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.786260 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1ea700e-e7ac-444f-8dee-a7172d4b4a49","Type":"ContainerStarted","Data":"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d"} Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.786442 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-metadata" containerID="cri-o://49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d" gracePeriod=30 Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.786386 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-log" containerID="cri-o://59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b" gracePeriod=30 Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.793399 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20c0f2a9-8737-4d05-911d-be085ade827a","Type":"ContainerStarted","Data":"ae243610bf6fb1a2b02fc195de4873d10a0019b5a232e1e8562bd959d6fcc066"} Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.831058 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.734656352 podStartE2EDuration="6.831039042s" podCreationTimestamp="2025-10-06 12:28:23 +0000 UTC" firstStartedPulling="2025-10-06 12:28:25.050245476 +0000 UTC m=+1191.599951241" lastFinishedPulling="2025-10-06 12:28:28.146628166 +0000 UTC m=+1194.696333931" observedRunningTime="2025-10-06 12:28:29.816714695 +0000 UTC m=+1196.366420470" watchObservedRunningTime="2025-10-06 12:28:29.831039042 +0000 UTC m=+1196.380744827" Oct 06 12:28:29 crc kubenswrapper[4892]: I1006 12:28:29.845534 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.65856763 podStartE2EDuration="6.845507682s" podCreationTimestamp="2025-10-06 12:28:23 +0000 UTC" firstStartedPulling="2025-10-06 12:28:24.960067435 +0000 UTC m=+1191.509773200" lastFinishedPulling="2025-10-06 12:28:28.147007487 +0000 UTC m=+1194.696713252" observedRunningTime="2025-10-06 12:28:29.841425084 +0000 UTC m=+1196.391130849" watchObservedRunningTime="2025-10-06 12:28:29.845507682 +0000 UTC m=+1196.395213447" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.424348 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.534680 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-combined-ca-bundle\") pod \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.534730 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-logs\") pod \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.534878 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb4nd\" (UniqueName: \"kubernetes.io/projected/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-kube-api-access-kb4nd\") pod \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.534942 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-config-data\") pod \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\" (UID: \"e1ea700e-e7ac-444f-8dee-a7172d4b4a49\") " Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.535650 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-logs" (OuterVolumeSpecName: "logs") pod "e1ea700e-e7ac-444f-8dee-a7172d4b4a49" (UID: "e1ea700e-e7ac-444f-8dee-a7172d4b4a49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.543496 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-kube-api-access-kb4nd" (OuterVolumeSpecName: "kube-api-access-kb4nd") pod "e1ea700e-e7ac-444f-8dee-a7172d4b4a49" (UID: "e1ea700e-e7ac-444f-8dee-a7172d4b4a49"). InnerVolumeSpecName "kube-api-access-kb4nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.576484 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-config-data" (OuterVolumeSpecName: "config-data") pod "e1ea700e-e7ac-444f-8dee-a7172d4b4a49" (UID: "e1ea700e-e7ac-444f-8dee-a7172d4b4a49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.596389 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1ea700e-e7ac-444f-8dee-a7172d4b4a49" (UID: "e1ea700e-e7ac-444f-8dee-a7172d4b4a49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.636817 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.636851 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.636860 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb4nd\" (UniqueName: \"kubernetes.io/projected/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-kube-api-access-kb4nd\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.636870 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1ea700e-e7ac-444f-8dee-a7172d4b4a49-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.805204 4892 generic.go:334] "Generic (PLEG): container finished" podID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerID="49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d" exitCode=0 Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.805235 4892 generic.go:334] "Generic (PLEG): container finished" podID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerID="59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b" exitCode=143 Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.805309 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1ea700e-e7ac-444f-8dee-a7172d4b4a49","Type":"ContainerDied","Data":"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d"} Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.805379 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1ea700e-e7ac-444f-8dee-a7172d4b4a49","Type":"ContainerDied","Data":"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b"} Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.805392 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e1ea700e-e7ac-444f-8dee-a7172d4b4a49","Type":"ContainerDied","Data":"a7d6d35411cfbd1e45b0792f4b1fdb3a25606bed22b344d976b3f391bfb57674"} Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.805411 4892 scope.go:117] "RemoveContainer" containerID="49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.806875 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.837802 4892 scope.go:117] "RemoveContainer" containerID="59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.853969 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.859054 4892 scope.go:117] "RemoveContainer" containerID="49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d" Oct 06 12:28:30 crc kubenswrapper[4892]: E1006 12:28:30.859657 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d\": container with ID starting with 49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d not found: ID does not exist" containerID="49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.859688 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d"} err="failed to get container status \"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d\": rpc error: code = NotFound desc = could not find container \"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d\": container with ID starting with 49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d not found: ID does not exist" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.859712 4892 scope.go:117] "RemoveContainer" containerID="59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b" Oct 06 12:28:30 crc kubenswrapper[4892]: E1006 12:28:30.860026 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b\": container with ID starting with 59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b not found: ID does not exist" containerID="59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.860065 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b"} err="failed to get container status \"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b\": rpc error: code = NotFound desc = could not find container \"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b\": container with ID starting with 59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b not found: ID does not exist" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.860088 4892 scope.go:117] "RemoveContainer" containerID="49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.860376 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d"} err="failed to get container status \"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d\": rpc error: code = NotFound desc = could not find container \"49f5a4245fff74f52de169696e22c75828b595e4cfabf09ae8505f5d5a1bd75d\": container with ID starting with 
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.860390 4892 scope.go:117] "RemoveContainer" containerID="59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b"
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.860719 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b"} err="failed to get container status \"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b\": rpc error: code = NotFound desc = could not find container \"59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b\": container with ID starting with 59af19727c0c2a52dbceb535b59f43094589d7d9c42cf1feb0b6cbcf69c7658b not found: ID does not exist"
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.871394 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.893376 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 12:28:30 crc kubenswrapper[4892]: E1006 12:28:30.893806 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-log"
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.893824 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-log"
Oct 06 12:28:30 crc kubenswrapper[4892]: E1006 12:28:30.893852 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-metadata"
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.893859 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-metadata"
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.894028 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-metadata"
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.894052 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" containerName="nova-metadata-log"
Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.895127 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.897460 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.898820 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 12:28:30 crc kubenswrapper[4892]: I1006 12:28:30.912234 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.042875 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.042935 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1fe6c8-3c90-4a54-8175-826d62550610-logs\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.043000 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl9q9\" (UniqueName: \"kubernetes.io/projected/0e1fe6c8-3c90-4a54-8175-826d62550610-kube-api-access-pl9q9\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.043065 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-config-data\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.043113 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.145550 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.145602 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1fe6c8-3c90-4a54-8175-826d62550610-logs\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.145631 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl9q9\" (UniqueName: \"kubernetes.io/projected/0e1fe6c8-3c90-4a54-8175-826d62550610-kube-api-access-pl9q9\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " 
pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.145673 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-config-data\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.145719 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.146473 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1fe6c8-3c90-4a54-8175-826d62550610-logs\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.155602 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.155690 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.156820 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-config-data\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.161805 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl9q9\" (UniqueName: \"kubernetes.io/projected/0e1fe6c8-3c90-4a54-8175-826d62550610-kube-api-access-pl9q9\") pod \"nova-metadata-0\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.209929 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.706319 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:31 crc kubenswrapper[4892]: W1006 12:28:31.708120 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e1fe6c8_3c90_4a54_8175_826d62550610.slice/crio-edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1 WatchSource:0}: Error finding container edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1: Status 404 returned error can't find the container with id edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1 Oct 06 12:28:31 crc kubenswrapper[4892]: I1006 12:28:31.820228 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0e1fe6c8-3c90-4a54-8175-826d62550610","Type":"ContainerStarted","Data":"edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1"} Oct 06 12:28:32 crc kubenswrapper[4892]: I1006 12:28:32.187267 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ea700e-e7ac-444f-8dee-a7172d4b4a49" path="/var/lib/kubelet/pods/e1ea700e-e7ac-444f-8dee-a7172d4b4a49/volumes" Oct 06 12:28:32 crc kubenswrapper[4892]: I1006 12:28:32.831124 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0e1fe6c8-3c90-4a54-8175-826d62550610","Type":"ContainerStarted","Data":"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37"} Oct 06 12:28:32 crc kubenswrapper[4892]: I1006 12:28:32.831168 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0e1fe6c8-3c90-4a54-8175-826d62550610","Type":"ContainerStarted","Data":"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044"} Oct 06 12:28:32 crc kubenswrapper[4892]: I1006 12:28:32.863440 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.863413531 podStartE2EDuration="2.863413531s" podCreationTimestamp="2025-10-06 12:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:32.853278436 +0000 UTC m=+1199.402984251" watchObservedRunningTime="2025-10-06 12:28:32.863413531 +0000 UTC m=+1199.413119306" Oct 06 12:28:33 crc kubenswrapper[4892]: I1006 12:28:33.010995 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 12:28:33 crc kubenswrapper[4892]: I1006 12:28:33.845719 4892 generic.go:334] "Generic (PLEG): container finished" podID="49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" containerID="26250415420fe80d7f2ca52bdfdda08ae762ecbaf0a1346ee402bb75255619bf" exitCode=0 Oct 06 12:28:33 crc kubenswrapper[4892]: I1006 12:28:33.846867 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5jh8x" event={"ID":"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0","Type":"ContainerDied","Data":"26250415420fe80d7f2ca52bdfdda08ae762ecbaf0a1346ee402bb75255619bf"} Oct 06 12:28:34 crc kubenswrapper[4892]: I1006 12:28:34.140883 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:28:34 crc kubenswrapper[4892]: I1006 12:28:34.140941 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:28:34 crc 
kubenswrapper[4892]: I1006 12:28:34.145101 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 06 12:28:34 crc kubenswrapper[4892]: I1006 12:28:34.217710 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 06 12:28:34 crc kubenswrapper[4892]: I1006 12:28:34.826510 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg"
Oct 06 12:28:34 crc kubenswrapper[4892]: I1006 12:28:34.933642 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cddcc8d5f-5nwpd"]
Oct 06 12:28:34 crc kubenswrapper[4892]: I1006 12:28:34.933869 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" podUID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerName="dnsmasq-dns" containerID="cri-o://ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b" gracePeriod=10
Oct 06 12:28:34 crc kubenswrapper[4892]: I1006 12:28:34.948065 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.195570 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.236556 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.386009 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5jh8x"
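The two startup-probe failures above are plain HTTP GETs against the pod IP that hit the client timeout before nova-api answered ("context deadline exceeded … awaiting headers"); both of nova-api-0's containers probe the same endpoint, http://10.217.0.210:8774/. Functionally the probe reduces to roughly the following sketch (the one-second timeout is an assumption here; the real value comes from the container's probe spec):

```python
import urllib.request
import urllib.error

def http_probe(url: str, timeout_s: float = 1.0) -> bool:
    """Approximate an HTTP startup/readiness probe: any 2xx/3xx answer
    before the deadline is success; timeouts, refused connections and
    error statuses count as failures, as in the log lines above."""
    try:
        with urllib.request.urlopen(url, timeout=timeout_s) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

# Probe target taken from the log; only reachable from the cluster network.
# http_probe("http://10.217.0.210:8774/")
```

Consistent with a slow start rather than a crash, the same probes flip to healthy shortly after, once nova-api finishes coming up.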
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.574343 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmrw7\" (UniqueName: \"kubernetes.io/projected/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-kube-api-access-zmrw7\") pod \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.574391 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-config-data\") pod \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.574507 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-combined-ca-bundle\") pod \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.574649 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-scripts\") pod \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\" (UID: \"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.580472 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-kube-api-access-zmrw7" (OuterVolumeSpecName: "kube-api-access-zmrw7") pod "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" (UID: "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0"). InnerVolumeSpecName "kube-api-access-zmrw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.584298 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-scripts" (OuterVolumeSpecName: "scripts") pod "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" (UID: "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.604093 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-config-data" (OuterVolumeSpecName: "config-data") pod "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" (UID: "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.609414 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" (UID: "49c8ba56-1bfe-4b22-bc88-7d25ac9113d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.610224 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.677477 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmrw7\" (UniqueName: \"kubernetes.io/projected/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-kube-api-access-zmrw7\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.677515 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.677524 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.677535 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.779185 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-sb\") pod \"2abc9761-3bef-408b-ab0e-2947fc29b250\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.779237 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-nb\") pod \"2abc9761-3bef-408b-ab0e-2947fc29b250\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.779366 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6rlz\" (UniqueName: \"kubernetes.io/projected/2abc9761-3bef-408b-ab0e-2947fc29b250-kube-api-access-q6rlz\") pod \"2abc9761-3bef-408b-ab0e-2947fc29b250\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.779439 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-swift-storage-0\") pod \"2abc9761-3bef-408b-ab0e-2947fc29b250\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.779478 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-config\") pod \"2abc9761-3bef-408b-ab0e-2947fc29b250\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.779583 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-svc\") pod \"2abc9761-3bef-408b-ab0e-2947fc29b250\" (UID: \"2abc9761-3bef-408b-ab0e-2947fc29b250\") " Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.789780 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2abc9761-3bef-408b-ab0e-2947fc29b250-kube-api-access-q6rlz" (OuterVolumeSpecName: "kube-api-access-q6rlz") pod 
"2abc9761-3bef-408b-ab0e-2947fc29b250" (UID: "2abc9761-3bef-408b-ab0e-2947fc29b250"). InnerVolumeSpecName "kube-api-access-q6rlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.843490 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2abc9761-3bef-408b-ab0e-2947fc29b250" (UID: "2abc9761-3bef-408b-ab0e-2947fc29b250"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.849368 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2abc9761-3bef-408b-ab0e-2947fc29b250" (UID: "2abc9761-3bef-408b-ab0e-2947fc29b250"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.854381 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2abc9761-3bef-408b-ab0e-2947fc29b250" (UID: "2abc9761-3bef-408b-ab0e-2947fc29b250"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.854799 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-config" (OuterVolumeSpecName: "config") pod "2abc9761-3bef-408b-ab0e-2947fc29b250" (UID: "2abc9761-3bef-408b-ab0e-2947fc29b250"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.858931 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2abc9761-3bef-408b-ab0e-2947fc29b250" (UID: "2abc9761-3bef-408b-ab0e-2947fc29b250"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.872152 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5jh8x" event={"ID":"49c8ba56-1bfe-4b22-bc88-7d25ac9113d0","Type":"ContainerDied","Data":"d699cea7268452ffbf92107c90614463833f97253da7e7fc47251616b6989b2e"} Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.872192 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d699cea7268452ffbf92107c90614463833f97253da7e7fc47251616b6989b2e" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.872242 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5jh8x" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.881803 4892 generic.go:334] "Generic (PLEG): container finished" podID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerID="ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b" exitCode=0 Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.881850 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.881904 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" event={"ID":"2abc9761-3bef-408b-ab0e-2947fc29b250","Type":"ContainerDied","Data":"ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b"} Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.881930 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cddcc8d5f-5nwpd" event={"ID":"2abc9761-3bef-408b-ab0e-2947fc29b250","Type":"ContainerDied","Data":"044cc95100493c8ba2c49c7bdb2c77d91af132eeffbdc30c716379b6b0f12006"} Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.881946 4892 scope.go:117] "RemoveContainer" containerID="ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.883974 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.884078 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.884156 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6rlz\" (UniqueName: \"kubernetes.io/projected/2abc9761-3bef-408b-ab0e-2947fc29b250-kube-api-access-q6rlz\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.884226 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.884290 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.884376 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2abc9761-3bef-408b-ab0e-2947fc29b250-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.910817 4892 scope.go:117] "RemoveContainer" containerID="7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.924383 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cddcc8d5f-5nwpd"] Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.932679 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cddcc8d5f-5nwpd"] Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.949483 4892 scope.go:117] "RemoveContainer" containerID="ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b" Oct 06 12:28:35 crc kubenswrapper[4892]: E1006 12:28:35.950421 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b\": container with ID starting with ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b not found: ID does not exist" 
containerID="ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.950546 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b"} err="failed to get container status \"ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b\": rpc error: code = NotFound desc = could not find container \"ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b\": container with ID starting with ba01cb2985c45f415c9bb739f1c4780cc7c84cb6b439e132a5b8a8dea911824b not found: ID does not exist" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.950637 4892 scope.go:117] "RemoveContainer" containerID="7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c" Oct 06 12:28:35 crc kubenswrapper[4892]: E1006 12:28:35.954423 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c\": container with ID starting with 7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c not found: ID does not exist" containerID="7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c" Oct 06 12:28:35 crc kubenswrapper[4892]: I1006 12:28:35.954533 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c"} err="failed to get container status \"7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c\": rpc error: code = NotFound desc = could not find container \"7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c\": container with ID starting with 7e1fd0a830f817fda0394178279bc68a160b01bf2872ec424d07c810a57d017c not found: ID does not exist" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.022464 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.024353 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-api" containerID="cri-o://ae243610bf6fb1a2b02fc195de4873d10a0019b5a232e1e8562bd959d6fcc066" gracePeriod=30 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.022728 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-log" containerID="cri-o://9056e7a48a8bab34d11aa0082d3a806a87ddfd7e1ae89f4b6b3b0f1c1d190f2b" gracePeriod=30 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.072379 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.096732 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.097204 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerName="nova-metadata-log" containerID="cri-o://d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044" gracePeriod=30 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.097834 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerName="nova-metadata-metadata" containerID="cri-o://d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37" gracePeriod=30 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.181499 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2abc9761-3bef-408b-ab0e-2947fc29b250" path="/var/lib/kubelet/pods/2abc9761-3bef-408b-ab0e-2947fc29b250/volumes" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.210473 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.210925 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.841794 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.892843 4892 generic.go:334] "Generic (PLEG): container finished" podID="20c0f2a9-8737-4d05-911d-be085ade827a" containerID="9056e7a48a8bab34d11aa0082d3a806a87ddfd7e1ae89f4b6b3b0f1c1d190f2b" exitCode=143 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.892874 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20c0f2a9-8737-4d05-911d-be085ade827a","Type":"ContainerDied","Data":"9056e7a48a8bab34d11aa0082d3a806a87ddfd7e1ae89f4b6b3b0f1c1d190f2b"} Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.895085 4892 generic.go:334] "Generic (PLEG): container finished" podID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerID="d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37" exitCode=0 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.895103 4892 generic.go:334] "Generic (PLEG): container finished" podID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerID="d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044" exitCode=143 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.895172 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0e1fe6c8-3c90-4a54-8175-826d62550610","Type":"ContainerDied","Data":"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37"} Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.895223 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0e1fe6c8-3c90-4a54-8175-826d62550610","Type":"ContainerDied","Data":"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044"} Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.895236 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0e1fe6c8-3c90-4a54-8175-826d62550610","Type":"ContainerDied","Data":"edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1"} Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.895253 4892 scope.go:117] "RemoveContainer" containerID="d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.895872 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.898279 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a10b50f1-1410-48d7-a999-43c250a201de" containerName="nova-scheduler-scheduler" containerID="cri-o://fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" gracePeriod=30 Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.935451 4892 scope.go:117] "RemoveContainer" containerID="d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.994355 4892 scope.go:117] "RemoveContainer" containerID="d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37" Oct 06 12:28:36 crc kubenswrapper[4892]: E1006 12:28:36.994850 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37\": container with ID starting with d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37 not found: ID does not exist" containerID="d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.994893 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37"} err="failed to get container status \"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37\": rpc error: code = NotFound desc = could not find container \"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37\": container with ID starting with d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37 not found: ID does not exist" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.994918 4892 scope.go:117] "RemoveContainer" containerID="d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044" Oct 06 12:28:36 crc kubenswrapper[4892]: E1006 12:28:36.995665 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044\": container with ID starting with d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044 not found: ID does not exist" containerID="d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.995705 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044"} err="failed to get container status \"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044\": rpc error: code = NotFound desc = could not find container \"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044\": container with ID starting with d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044 not found: ID does not exist" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.995732 4892 scope.go:117] "RemoveContainer" containerID="d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.996200 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37"} err="failed to get container status 
\"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37\": rpc error: code = NotFound desc = could not find container \"d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37\": container with ID starting with d3613cab9b193a556a84a81bf725fabe6c1f2df71ce78153617c28f39d61de37 not found: ID does not exist" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.996255 4892 scope.go:117] "RemoveContainer" containerID="d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044" Oct 06 12:28:36 crc kubenswrapper[4892]: I1006 12:28:36.997390 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044"} err="failed to get container status \"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044\": rpc error: code = NotFound desc = could not find container \"d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044\": container with ID starting with d8d37b2f7a7c1c7f5dd0fa0197cda6dd4e0df7348d105a033c45ae0796291044 not found: ID does not exist" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.019092 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-config-data\") pod \"0e1fe6c8-3c90-4a54-8175-826d62550610\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.019158 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1fe6c8-3c90-4a54-8175-826d62550610-logs\") pod \"0e1fe6c8-3c90-4a54-8175-826d62550610\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.019196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl9q9\" (UniqueName: \"kubernetes.io/projected/0e1fe6c8-3c90-4a54-8175-826d62550610-kube-api-access-pl9q9\") pod \"0e1fe6c8-3c90-4a54-8175-826d62550610\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.019337 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-nova-metadata-tls-certs\") pod \"0e1fe6c8-3c90-4a54-8175-826d62550610\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.019378 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-combined-ca-bundle\") pod \"0e1fe6c8-3c90-4a54-8175-826d62550610\" (UID: \"0e1fe6c8-3c90-4a54-8175-826d62550610\") " Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.019519 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e1fe6c8-3c90-4a54-8175-826d62550610-logs" (OuterVolumeSpecName: "logs") pod "0e1fe6c8-3c90-4a54-8175-826d62550610" (UID: "0e1fe6c8-3c90-4a54-8175-826d62550610"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.019830 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e1fe6c8-3c90-4a54-8175-826d62550610-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.024593 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e1fe6c8-3c90-4a54-8175-826d62550610-kube-api-access-pl9q9" (OuterVolumeSpecName: "kube-api-access-pl9q9") pod "0e1fe6c8-3c90-4a54-8175-826d62550610" (UID: "0e1fe6c8-3c90-4a54-8175-826d62550610"). InnerVolumeSpecName "kube-api-access-pl9q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.063622 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-config-data" (OuterVolumeSpecName: "config-data") pod "0e1fe6c8-3c90-4a54-8175-826d62550610" (UID: "0e1fe6c8-3c90-4a54-8175-826d62550610"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.065786 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e1fe6c8-3c90-4a54-8175-826d62550610" (UID: "0e1fe6c8-3c90-4a54-8175-826d62550610"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.091845 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "0e1fe6c8-3c90-4a54-8175-826d62550610" (UID: "0e1fe6c8-3c90-4a54-8175-826d62550610"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.121224 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.121270 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl9q9\" (UniqueName: \"kubernetes.io/projected/0e1fe6c8-3c90-4a54-8175-826d62550610-kube-api-access-pl9q9\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.121305 4892 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.121316 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e1fe6c8-3c90-4a54-8175-826d62550610-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.251509 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.259855 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.267606 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:37 crc kubenswrapper[4892]: E1006 12:28:37.268511 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerName="nova-metadata-log" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.268531 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerName="nova-metadata-log" Oct 06 12:28:37 crc kubenswrapper[4892]: E1006 12:28:37.268553 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" containerName="nova-manage" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.268561 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" containerName="nova-manage" Oct 06 12:28:37 crc kubenswrapper[4892]: E1006 12:28:37.268574 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerName="init" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.268580 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerName="init" Oct 06 12:28:37 crc kubenswrapper[4892]: E1006 12:28:37.268604 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerName="dnsmasq-dns" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.268609 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerName="dnsmasq-dns" Oct 06 12:28:37 crc kubenswrapper[4892]: E1006 12:28:37.268619 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerName="nova-metadata-metadata" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.268625 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" 
containerName="nova-metadata-metadata" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.268827 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" containerName="nova-manage" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.269000 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2abc9761-3bef-408b-ab0e-2947fc29b250" containerName="dnsmasq-dns" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.269020 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerName="nova-metadata-log" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.269056 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" containerName="nova-metadata-metadata" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.270146 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.277430 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.280219 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.287679 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.427032 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-logs\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.427287 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.427431 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-config-data\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.427549 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trndf\" (UniqueName: \"kubernetes.io/projected/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-kube-api-access-trndf\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.427655 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.528883 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-logs\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.528933 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.528999 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-config-data\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.529029 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trndf\" (UniqueName: \"kubernetes.io/projected/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-kube-api-access-trndf\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.529071 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.529765 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-logs\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.533910 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.534663 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-config-data\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.541957 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.550856 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trndf\" (UniqueName: \"kubernetes.io/projected/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-kube-api-access-trndf\") pod \"nova-metadata-0\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " pod="openstack/nova-metadata-0" Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 
Oct 06 12:28:37 crc kubenswrapper[4892]: I1006 12:28:37.593901 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.187836 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e1fe6c8-3c90-4a54-8175-826d62550610" path="/var/lib/kubelet/pods/0e1fe6c8-3c90-4a54-8175-826d62550610/volumes"
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.188844 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.366704 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.366956 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38" containerName="kube-state-metrics" containerID="cri-o://ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283" gracePeriod=30
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.808628 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.930018 4892 generic.go:334] "Generic (PLEG): container finished" podID="77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38" containerID="ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283" exitCode=2
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.930287 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38","Type":"ContainerDied","Data":"ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283"}
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.930313 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38","Type":"ContainerDied","Data":"612d2715cbadcdec45f9ad80e5f6bc23945132a36a6f2bb273bfbe9d2a462247"}
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.930344 4892 scope.go:117] "RemoveContainer" containerID="ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283"
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.930427 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.940116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4","Type":"ContainerStarted","Data":"a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a"}
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.940155 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4","Type":"ContainerStarted","Data":"bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35"}
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.940164 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4","Type":"ContainerStarted","Data":"acc3bfb47282a9d1eb78a0d127709d5087c363c5e4c1086bdd8b8a665189b831"}
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.944761 4892 generic.go:334] "Generic (PLEG): container finished" podID="905c9ab8-2e12-4c06-9a9f-890faab36198" containerID="018459029c3e97dc59ff073e0b9f90cc653ff818e14cfcc4ba96541ca86fb3f2" exitCode=0
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.944800 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" event={"ID":"905c9ab8-2e12-4c06-9a9f-890faab36198","Type":"ContainerDied","Data":"018459029c3e97dc59ff073e0b9f90cc653ff818e14cfcc4ba96541ca86fb3f2"}
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.966646 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xprcl\" (UniqueName: \"kubernetes.io/projected/77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38-kube-api-access-xprcl\") pod \"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38\" (UID: \"77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38\") "
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.975635 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38-kube-api-access-xprcl" (OuterVolumeSpecName: "kube-api-access-xprcl") pod "77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38" (UID: "77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38"). InnerVolumeSpecName "kube-api-access-xprcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.978164 4892 scope.go:117] "RemoveContainer" containerID="ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283"
Oct 06 12:28:38 crc kubenswrapper[4892]: E1006 12:28:38.980812 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283\": container with ID starting with ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283 not found: ID does not exist" containerID="ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283"
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.980848 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283"} err="failed to get container status \"ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283\": rpc error: code = NotFound desc = could not find container \"ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283\": container with ID starting with ff45e317e1a2b9775d7ece9a9135fe8ff5de7854fa9ace4745a465531d0f2283 not found: ID does not exist"
Oct 06 12:28:38 crc kubenswrapper[4892]: I1006 12:28:38.981916 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9818999819999998 podStartE2EDuration="1.981899982s" podCreationTimestamp="2025-10-06 12:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:38.968516343 +0000 UTC m=+1205.518222108" watchObservedRunningTime="2025-10-06 12:28:38.981899982 +0000 UTC m=+1205.531605747"
Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.069822 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xprcl\" (UniqueName: \"kubernetes.io/projected/77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38-kube-api-access-xprcl\") on node \"crc\" DevicePath \"\""
Oct 06 12:28:39 crc kubenswrapper[4892]: E1006 12:28:39.148075 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 12:28:39 crc kubenswrapper[4892]: E1006 12:28:39.151724 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 12:28:39 crc kubenswrapper[4892]: E1006 12:28:39.153228 4892 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Oct 06 12:28:39 crc kubenswrapper[4892]: E1006 12:28:39.153283 4892 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a10b50f1-1410-48d7-a999-43c250a201de" containerName="nova-scheduler-scheduler"
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a10b50f1-1410-48d7-a999-43c250a201de" containerName="nova-scheduler-scheduler" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.259482 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.274569 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.307067 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:28:39 crc kubenswrapper[4892]: E1006 12:28:39.307777 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38" containerName="kube-state-metrics" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.307806 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38" containerName="kube-state-metrics" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.308129 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38" containerName="kube-state-metrics" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.309039 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.310777 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.318799 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.320623 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.477467 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6jzx\" (UniqueName: \"kubernetes.io/projected/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-api-access-g6jzx\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.477524 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.477647 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.477670 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc 
kubenswrapper[4892]: I1006 12:28:39.579192 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.579243 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.579347 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6jzx\" (UniqueName: \"kubernetes.io/projected/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-api-access-g6jzx\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.579383 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.593304 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.593413 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.593886 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.622780 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6jzx\" (UniqueName: \"kubernetes.io/projected/fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f-kube-api-access-g6jzx\") pod \"kube-state-metrics-0\" (UID: \"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f\") " pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.644155 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.968060 4892 generic.go:334] "Generic (PLEG): container finished" podID="20c0f2a9-8737-4d05-911d-be085ade827a" containerID="ae243610bf6fb1a2b02fc195de4873d10a0019b5a232e1e8562bd959d6fcc066" exitCode=0 Oct 06 12:28:39 crc kubenswrapper[4892]: I1006 12:28:39.968249 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20c0f2a9-8737-4d05-911d-be085ade827a","Type":"ContainerDied","Data":"ae243610bf6fb1a2b02fc195de4873d10a0019b5a232e1e8562bd959d6fcc066"} Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.134601 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.182049 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38" path="/var/lib/kubelet/pods/77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38/volumes" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.510786 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.523971 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.600945 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-scripts\") pod \"905c9ab8-2e12-4c06-9a9f-890faab36198\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.601459 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-combined-ca-bundle\") pod \"905c9ab8-2e12-4c06-9a9f-890faab36198\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.601597 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-config-data\") pod \"905c9ab8-2e12-4c06-9a9f-890faab36198\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.601682 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82rmt\" (UniqueName: \"kubernetes.io/projected/905c9ab8-2e12-4c06-9a9f-890faab36198-kube-api-access-82rmt\") pod \"905c9ab8-2e12-4c06-9a9f-890faab36198\" (UID: \"905c9ab8-2e12-4c06-9a9f-890faab36198\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.606959 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905c9ab8-2e12-4c06-9a9f-890faab36198-kube-api-access-82rmt" (OuterVolumeSpecName: "kube-api-access-82rmt") pod "905c9ab8-2e12-4c06-9a9f-890faab36198" (UID: "905c9ab8-2e12-4c06-9a9f-890faab36198"). InnerVolumeSpecName "kube-api-access-82rmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.607019 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-scripts" (OuterVolumeSpecName: "scripts") pod "905c9ab8-2e12-4c06-9a9f-890faab36198" (UID: "905c9ab8-2e12-4c06-9a9f-890faab36198"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.633691 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "905c9ab8-2e12-4c06-9a9f-890faab36198" (UID: "905c9ab8-2e12-4c06-9a9f-890faab36198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.637423 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-config-data" (OuterVolumeSpecName: "config-data") pod "905c9ab8-2e12-4c06-9a9f-890faab36198" (UID: "905c9ab8-2e12-4c06-9a9f-890faab36198"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.703786 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c0f2a9-8737-4d05-911d-be085ade827a-logs\") pod \"20c0f2a9-8737-4d05-911d-be085ade827a\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.703929 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjvk5\" (UniqueName: \"kubernetes.io/projected/20c0f2a9-8737-4d05-911d-be085ade827a-kube-api-access-zjvk5\") pod \"20c0f2a9-8737-4d05-911d-be085ade827a\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.704002 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-config-data\") pod \"20c0f2a9-8737-4d05-911d-be085ade827a\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.704030 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-combined-ca-bundle\") pod \"20c0f2a9-8737-4d05-911d-be085ade827a\" (UID: \"20c0f2a9-8737-4d05-911d-be085ade827a\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.704630 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.704652 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.704666 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905c9ab8-2e12-4c06-9a9f-890faab36198-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc 
kubenswrapper[4892]: I1006 12:28:40.704679 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82rmt\" (UniqueName: \"kubernetes.io/projected/905c9ab8-2e12-4c06-9a9f-890faab36198-kube-api-access-82rmt\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.705720 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c0f2a9-8737-4d05-911d-be085ade827a-logs" (OuterVolumeSpecName: "logs") pod "20c0f2a9-8737-4d05-911d-be085ade827a" (UID: "20c0f2a9-8737-4d05-911d-be085ade827a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.737643 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c0f2a9-8737-4d05-911d-be085ade827a-kube-api-access-zjvk5" (OuterVolumeSpecName: "kube-api-access-zjvk5") pod "20c0f2a9-8737-4d05-911d-be085ade827a" (UID: "20c0f2a9-8737-4d05-911d-be085ade827a"). InnerVolumeSpecName "kube-api-access-zjvk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.754459 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-config-data" (OuterVolumeSpecName: "config-data") pod "20c0f2a9-8737-4d05-911d-be085ade827a" (UID: "20c0f2a9-8737-4d05-911d-be085ade827a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.769298 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20c0f2a9-8737-4d05-911d-be085ade827a" (UID: "20c0f2a9-8737-4d05-911d-be085ade827a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.794740 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.809765 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c0f2a9-8737-4d05-911d-be085ade827a-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.809804 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjvk5\" (UniqueName: \"kubernetes.io/projected/20c0f2a9-8737-4d05-911d-be085ade827a-kube-api-access-zjvk5\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.809813 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.809822 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0f2a9-8737-4d05-911d-be085ade827a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.892970 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.893397 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-central-agent" containerID="cri-o://d7adf2a25e7308a944dff4171113fc14431e0191db01f9bb204e928abc917580" gracePeriod=30 Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.893772 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="sg-core" containerID="cri-o://31ee040d1802be617657ee2e7583e92d9ac815b91a4dc9f9b9e519cb7a6b1ebe" gracePeriod=30 Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.893778 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="proxy-httpd" containerID="cri-o://bd53166acfea668ca2c0075070652c28dfb0422c60e729afe0eb8e8f5e347803" gracePeriod=30 Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.893833 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-notification-agent" containerID="cri-o://bc33889523b250f2ece09404b373c7622b6ac3391bdc923a873fee1800f74d80" gracePeriod=30 Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.910831 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-config-data\") pod \"a10b50f1-1410-48d7-a999-43c250a201de\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.911089 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-combined-ca-bundle\") pod \"a10b50f1-1410-48d7-a999-43c250a201de\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.911136 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlwmk\" 
(UniqueName: \"kubernetes.io/projected/a10b50f1-1410-48d7-a999-43c250a201de-kube-api-access-xlwmk\") pod \"a10b50f1-1410-48d7-a999-43c250a201de\" (UID: \"a10b50f1-1410-48d7-a999-43c250a201de\") " Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.916483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10b50f1-1410-48d7-a999-43c250a201de-kube-api-access-xlwmk" (OuterVolumeSpecName: "kube-api-access-xlwmk") pod "a10b50f1-1410-48d7-a999-43c250a201de" (UID: "a10b50f1-1410-48d7-a999-43c250a201de"). InnerVolumeSpecName "kube-api-access-xlwmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.941003 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a10b50f1-1410-48d7-a999-43c250a201de" (UID: "a10b50f1-1410-48d7-a999-43c250a201de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.951671 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-config-data" (OuterVolumeSpecName: "config-data") pod "a10b50f1-1410-48d7-a999-43c250a201de" (UID: "a10b50f1-1410-48d7-a999-43c250a201de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.980784 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20c0f2a9-8737-4d05-911d-be085ade827a","Type":"ContainerDied","Data":"8c24626d298ce78a3d48004efec81d6b44f168d5931b8c9f23b49fc3cfa6abf9"} Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.980846 4892 scope.go:117] "RemoveContainer" containerID="ae243610bf6fb1a2b02fc195de4873d10a0019b5a232e1e8562bd959d6fcc066" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.980834 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.982541 4892 generic.go:334] "Generic (PLEG): container finished" podID="a10b50f1-1410-48d7-a999-43c250a201de" containerID="fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" exitCode=0 Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.982582 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.982582 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a10b50f1-1410-48d7-a999-43c250a201de","Type":"ContainerDied","Data":"fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b"} Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.982634 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a10b50f1-1410-48d7-a999-43c250a201de","Type":"ContainerDied","Data":"b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a"} Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.984744 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" event={"ID":"905c9ab8-2e12-4c06-9a9f-890faab36198","Type":"ContainerDied","Data":"914eb5c03c9733879e11323b0301c09bc166efe16cfc243faf87db919aa58762"} Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.984782 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="914eb5c03c9733879e11323b0301c09bc166efe16cfc243faf87db919aa58762" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.984755 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rpnxm" Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.987075 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f","Type":"ContainerStarted","Data":"2ba1702ad910c28cf10aa0369ce78282aa885a5d1dd345724a127a85c8cef358"} Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.987111 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f","Type":"ContainerStarted","Data":"87e2ef05400787e813eccdf6de078f4d9912f40f4a1aa6afd3b860cfc732d9e7"} Oct 06 12:28:40 crc kubenswrapper[4892]: I1006 12:28:40.987779 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.013081 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.013107 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlwmk\" (UniqueName: \"kubernetes.io/projected/a10b50f1-1410-48d7-a999-43c250a201de-kube-api-access-xlwmk\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.013116 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10b50f1-1410-48d7-a999-43c250a201de-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.018784 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.607394734 podStartE2EDuration="2.018767193s" podCreationTimestamp="2025-10-06 12:28:39 +0000 UTC" firstStartedPulling="2025-10-06 12:28:40.129711918 +0000 UTC m=+1206.679417683" lastFinishedPulling="2025-10-06 12:28:40.541084377 +0000 UTC m=+1207.090790142" observedRunningTime="2025-10-06 12:28:41.011721398 +0000 UTC m=+1207.561427163" 
watchObservedRunningTime="2025-10-06 12:28:41.018767193 +0000 UTC m=+1207.568472958" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.021622 4892 scope.go:117] "RemoveContainer" containerID="9056e7a48a8bab34d11aa0082d3a806a87ddfd7e1ae89f4b6b3b0f1c1d190f2b" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.056126 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.066475 4892 scope.go:117] "RemoveContainer" containerID="fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.070547 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.100454 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: E1006 12:28:41.101033 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-api" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101063 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-api" Oct 06 12:28:41 crc kubenswrapper[4892]: E1006 12:28:41.101095 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10b50f1-1410-48d7-a999-43c250a201de" containerName="nova-scheduler-scheduler" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101103 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10b50f1-1410-48d7-a999-43c250a201de" containerName="nova-scheduler-scheduler" Oct 06 12:28:41 crc kubenswrapper[4892]: E1006 12:28:41.101115 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-log" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101123 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-log" Oct 06 12:28:41 crc kubenswrapper[4892]: E1006 12:28:41.101137 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905c9ab8-2e12-4c06-9a9f-890faab36198" containerName="nova-cell1-conductor-db-sync" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101145 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="905c9ab8-2e12-4c06-9a9f-890faab36198" containerName="nova-cell1-conductor-db-sync" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101392 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="905c9ab8-2e12-4c06-9a9f-890faab36198" containerName="nova-cell1-conductor-db-sync" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101415 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-log" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101431 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10b50f1-1410-48d7-a999-43c250a201de" containerName="nova-scheduler-scheduler" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.101450 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" containerName="nova-api-api" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.102250 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.105977 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.109516 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.110751 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.112737 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.118224 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.130097 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.130407 4892 scope.go:117] "RemoveContainer" containerID="fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.134092 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.141423 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: E1006 12:28:41.144748 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b\": container with ID starting with fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b not found: ID does not exist" containerID="fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.144789 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b"} err="failed to get container status \"fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b\": rpc error: code = NotFound desc = could not find container \"fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b\": container with ID starting with fe5d0c2532f0bc4d6a00d90adecde25bd9ad28f77c67494b7b0a98a14d86ff8b not found: ID does not exist" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.151272 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.152961 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.161635 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.205930 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218069 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-config-data\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218139 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d101022-48e0-4666-b28f-e4dd08f380ad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218167 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8825961a-8b70-42b1-826f-28ed5d76499f-logs\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218195 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-config-data\") pod \"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218268 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswzw\" (UniqueName: \"kubernetes.io/projected/4d101022-48e0-4666-b28f-e4dd08f380ad-kube-api-access-jswzw\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218458 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218539 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f7fp\" (UniqueName: \"kubernetes.io/projected/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-kube-api-access-9f7fp\") pod \"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218564 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218633 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d101022-48e0-4666-b28f-e4dd08f380ad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.218702 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/8825961a-8b70-42b1-826f-28ed5d76499f-kube-api-access-xchc4\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.320771 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-config-data\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.320826 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d101022-48e0-4666-b28f-e4dd08f380ad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.320852 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8825961a-8b70-42b1-826f-28ed5d76499f-logs\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.320898 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-config-data\") pod \"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.320986 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jswzw\" (UniqueName: \"kubernetes.io/projected/4d101022-48e0-4666-b28f-e4dd08f380ad-kube-api-access-jswzw\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.321042 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.321117 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f7fp\" (UniqueName: \"kubernetes.io/projected/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-kube-api-access-9f7fp\") pod \"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.321138 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.321202 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d101022-48e0-4666-b28f-e4dd08f380ad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.321271 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/8825961a-8b70-42b1-826f-28ed5d76499f-kube-api-access-xchc4\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.324422 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8825961a-8b70-42b1-826f-28ed5d76499f-logs\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.326177 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d101022-48e0-4666-b28f-e4dd08f380ad-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.326237 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-config-data\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.328129 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.328136 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-config-data\") pod \"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.331869 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d101022-48e0-4666-b28f-e4dd08f380ad-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.332953 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.344814 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f7fp\" (UniqueName: \"kubernetes.io/projected/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-kube-api-access-9f7fp\") pod 
\"nova-scheduler-0\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.344959 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswzw\" (UniqueName: \"kubernetes.io/projected/4d101022-48e0-4666-b28f-e4dd08f380ad-kube-api-access-jswzw\") pod \"nova-cell1-conductor-0\" (UID: \"4d101022-48e0-4666-b28f-e4dd08f380ad\") " pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.345971 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/8825961a-8b70-42b1-826f-28ed5d76499f-kube-api-access-xchc4\") pod \"nova-api-0\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " pod="openstack/nova-api-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.508472 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.522186 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:41 crc kubenswrapper[4892]: I1006 12:28:41.535854 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.001498 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerID="bd53166acfea668ca2c0075070652c28dfb0422c60e729afe0eb8e8f5e347803" exitCode=0 Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.001793 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerID="31ee040d1802be617657ee2e7583e92d9ac815b91a4dc9f9b9e519cb7a6b1ebe" exitCode=2 Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.001802 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerID="d7adf2a25e7308a944dff4171113fc14431e0191db01f9bb204e928abc917580" exitCode=0 Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.001522 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerDied","Data":"bd53166acfea668ca2c0075070652c28dfb0422c60e729afe0eb8e8f5e347803"} Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.001870 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerDied","Data":"31ee040d1802be617657ee2e7583e92d9ac815b91a4dc9f9b9e519cb7a6b1ebe"} Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.001884 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerDied","Data":"d7adf2a25e7308a944dff4171113fc14431e0191db01f9bb204e928abc917580"} Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.017550 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.191019 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c0f2a9-8737-4d05-911d-be085ade827a" path="/var/lib/kubelet/pods/20c0f2a9-8737-4d05-911d-be085ade827a/volumes" Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.191891 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="a10b50f1-1410-48d7-a999-43c250a201de" path="/var/lib/kubelet/pods/a10b50f1-1410-48d7-a999-43c250a201de/volumes" Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.211172 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.222909 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.594737 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 12:28:42 crc kubenswrapper[4892]: I1006 12:28:42.594781 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.028124 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8825961a-8b70-42b1-826f-28ed5d76499f","Type":"ContainerStarted","Data":"11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b"} Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.028187 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8825961a-8b70-42b1-826f-28ed5d76499f","Type":"ContainerStarted","Data":"30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5"} Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.028207 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8825961a-8b70-42b1-826f-28ed5d76499f","Type":"ContainerStarted","Data":"21d5a64b46383c96c32491c3d58597e83db8475307251d940c108bef6a4c5370"} Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.032011 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4d101022-48e0-4666-b28f-e4dd08f380ad","Type":"ContainerStarted","Data":"9b92ce5e646e0e43e93db236e46c0bfae445493df45f58e2f534a8731f4fe831"} Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.032079 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.032097 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4d101022-48e0-4666-b28f-e4dd08f380ad","Type":"ContainerStarted","Data":"bd56b73f27dec4db56f1a9554635dbfdd873ea2b069c95438bcc18f4465d363e"} Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.034457 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40e26ba4-7fe8-4ae2-9839-9775c60c7e90","Type":"ContainerStarted","Data":"8ce6783a0f879b03f4a2fb9cdc1048d4383acb87ac4baa532a77af16419c5c5d"} Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.034485 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40e26ba4-7fe8-4ae2-9839-9775c60c7e90","Type":"ContainerStarted","Data":"60447312dd9768b19712fd444f2e15cff94f8867455fd19f6f9c23e8bd6f23c4"} Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.047538 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.047526367 podStartE2EDuration="2.047526367s" podCreationTimestamp="2025-10-06 12:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:43.046651782 +0000 UTC m=+1209.596357547" watchObservedRunningTime="2025-10-06 
12:28:43.047526367 +0000 UTC m=+1209.597232132" Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.067937 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.06791431 podStartE2EDuration="2.06791431s" podCreationTimestamp="2025-10-06 12:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:43.064532972 +0000 UTC m=+1209.614238747" watchObservedRunningTime="2025-10-06 12:28:43.06791431 +0000 UTC m=+1209.617620085" Oct 06 12:28:43 crc kubenswrapper[4892]: I1006 12:28:43.090258 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.090240539 podStartE2EDuration="2.090240539s" podCreationTimestamp="2025-10-06 12:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:28:43.083795682 +0000 UTC m=+1209.633501457" watchObservedRunningTime="2025-10-06 12:28:43.090240539 +0000 UTC m=+1209.639946314" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.072983 4892 generic.go:334] "Generic (PLEG): container finished" podID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerID="bc33889523b250f2ece09404b373c7622b6ac3391bdc923a873fee1800f74d80" exitCode=0 Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.073155 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerDied","Data":"bc33889523b250f2ece09404b373c7622b6ac3391bdc923a873fee1800f74d80"} Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.495158 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.509304 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.627813 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-scripts\") pod \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.627888 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8c4w\" (UniqueName: \"kubernetes.io/projected/a1bbb968-deb3-4ce5-9912-8e452f23b35b-kube-api-access-h8c4w\") pod \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.627921 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-combined-ca-bundle\") pod \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.628007 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-log-httpd\") pod \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.628036 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-sg-core-conf-yaml\") pod \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.628196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-run-httpd\") pod \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.628265 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-config-data\") pod \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\" (UID: \"a1bbb968-deb3-4ce5-9912-8e452f23b35b\") " Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.628737 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a1bbb968-deb3-4ce5-9912-8e452f23b35b" (UID: "a1bbb968-deb3-4ce5-9912-8e452f23b35b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.628972 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.629072 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a1bbb968-deb3-4ce5-9912-8e452f23b35b" (UID: "a1bbb968-deb3-4ce5-9912-8e452f23b35b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.635268 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-scripts" (OuterVolumeSpecName: "scripts") pod "a1bbb968-deb3-4ce5-9912-8e452f23b35b" (UID: "a1bbb968-deb3-4ce5-9912-8e452f23b35b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.637314 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1bbb968-deb3-4ce5-9912-8e452f23b35b-kube-api-access-h8c4w" (OuterVolumeSpecName: "kube-api-access-h8c4w") pod "a1bbb968-deb3-4ce5-9912-8e452f23b35b" (UID: "a1bbb968-deb3-4ce5-9912-8e452f23b35b"). InnerVolumeSpecName "kube-api-access-h8c4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.727834 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a1bbb968-deb3-4ce5-9912-8e452f23b35b" (UID: "a1bbb968-deb3-4ce5-9912-8e452f23b35b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.731333 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.731373 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1bbb968-deb3-4ce5-9912-8e452f23b35b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.731386 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.731398 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8c4w\" (UniqueName: \"kubernetes.io/projected/a1bbb968-deb3-4ce5-9912-8e452f23b35b-kube-api-access-h8c4w\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.754305 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1bbb968-deb3-4ce5-9912-8e452f23b35b" (UID: "a1bbb968-deb3-4ce5-9912-8e452f23b35b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.767558 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-config-data" (OuterVolumeSpecName: "config-data") pod "a1bbb968-deb3-4ce5-9912-8e452f23b35b" (UID: "a1bbb968-deb3-4ce5-9912-8e452f23b35b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.833825 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:46 crc kubenswrapper[4892]: I1006 12:28:46.833863 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bbb968-deb3-4ce5-9912-8e452f23b35b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.088278 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1bbb968-deb3-4ce5-9912-8e452f23b35b","Type":"ContainerDied","Data":"872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a"} Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.088397 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.088687 4892 scope.go:117] "RemoveContainer" containerID="bd53166acfea668ca2c0075070652c28dfb0422c60e729afe0eb8e8f5e347803" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.130076 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.132679 4892 scope.go:117] "RemoveContainer" containerID="31ee040d1802be617657ee2e7583e92d9ac815b91a4dc9f9b9e519cb7a6b1ebe" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.140061 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.169234 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:47 crc kubenswrapper[4892]: E1006 12:28:47.169903 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="sg-core" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.169923 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="sg-core" Oct 06 12:28:47 crc kubenswrapper[4892]: E1006 12:28:47.169939 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-notification-agent" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.169945 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-notification-agent" Oct 06 12:28:47 crc kubenswrapper[4892]: E1006 12:28:47.169960 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="proxy-httpd" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.169966 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="proxy-httpd" Oct 06 12:28:47 crc kubenswrapper[4892]: E1006 12:28:47.169989 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-central-agent" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.169995 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-central-agent" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.170195 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-central-agent" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.170210 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="proxy-httpd" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.170218 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="sg-core" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.170229 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" containerName="ceilometer-notification-agent" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.171955 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.175157 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.175246 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.175388 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.179680 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.184661 4892 scope.go:117] "RemoveContainer" containerID="bc33889523b250f2ece09404b373c7622b6ac3391bdc923a873fee1800f74d80" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.226387 4892 scope.go:117] "RemoveContainer" containerID="d7adf2a25e7308a944dff4171113fc14431e0191db01f9bb204e928abc917580" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.243485 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-config-data\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.243524 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.243789 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbgkl\" (UniqueName: \"kubernetes.io/projected/630beda2-2a6b-43e4-a495-12b3938e3139-kube-api-access-bbgkl\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.243867 
4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.243916 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.243937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-scripts\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.243983 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-run-httpd\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.244762 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-log-httpd\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347166 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbgkl\" (UniqueName: \"kubernetes.io/projected/630beda2-2a6b-43e4-a495-12b3938e3139-kube-api-access-bbgkl\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347276 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347413 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347458 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-scripts\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347527 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-run-httpd\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " 
pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347630 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-log-httpd\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347713 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-config-data\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.347757 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.348313 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-run-httpd\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.348513 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-log-httpd\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.352672 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-scripts\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.354130 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.355746 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-config-data\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.355858 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.360685 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 
12:28:47.362935 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbgkl\" (UniqueName: \"kubernetes.io/projected/630beda2-2a6b-43e4-a495-12b3938e3139-kube-api-access-bbgkl\") pod \"ceilometer-0\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.513127 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.594662 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 12:28:47 crc kubenswrapper[4892]: I1006 12:28:47.594706 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 12:28:48 crc kubenswrapper[4892]: I1006 12:28:48.029272 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:28:48 crc kubenswrapper[4892]: I1006 12:28:48.104078 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerStarted","Data":"2591b70f94202f39d3d37860149a27cbe8f931c3293382c0d17138e570d9a8f9"} Oct 06 12:28:48 crc kubenswrapper[4892]: I1006 12:28:48.189265 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1bbb968-deb3-4ce5-9912-8e452f23b35b" path="/var/lib/kubelet/pods/a1bbb968-deb3-4ce5-9912-8e452f23b35b/volumes" Oct 06 12:28:48 crc kubenswrapper[4892]: I1006 12:28:48.611458 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:28:48 crc kubenswrapper[4892]: I1006 12:28:48.611463 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.217:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:28:49 crc kubenswrapper[4892]: I1006 12:28:49.115011 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerStarted","Data":"17869140300ce402b3a44f363171d108e7f6759020f18697b67e1dbc7e8dd881"} Oct 06 12:28:49 crc kubenswrapper[4892]: I1006 12:28:49.115293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerStarted","Data":"a573102c5e65eb1daaacefa307398f3d7f668695ece2a67065b4aaa6f91e0d78"} Oct 06 12:28:49 crc kubenswrapper[4892]: I1006 12:28:49.659708 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 06 12:28:49 crc kubenswrapper[4892]: E1006 12:28:49.938000 4892 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/dbdabc3d06754b1e00f1c7dc77240fc3d9b7cc5f00eead1a0db00d4073b15c0b/diff" to get inode usage: stat /var/lib/containers/storage/overlay/dbdabc3d06754b1e00f1c7dc77240fc3d9b7cc5f00eead1a0db00d4073b15c0b/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack_kube-state-metrics-0_77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38/kube-state-metrics/0.log" to get inode usage: stat /var/log/pods/openstack_kube-state-metrics-0_77d2ae66-c6bc-42e4-9e5e-cf7a9b9b9d38/kube-state-metrics/0.log: no such file or directory Oct 06 12:28:50 crc kubenswrapper[4892]: I1006 12:28:50.130217 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerStarted","Data":"d732ccb1e7a4efc6eed26b003d6d300368f2035a1c9be1f9e8b2b5fc1b345fc0"} Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.149921 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerStarted","Data":"b97ae8a4b05a8e04a20845c85bb3296caeb106dc7330fc0274890c65c5bda6bc"} Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.150479 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.188886 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.552072312 podStartE2EDuration="4.188860362s" podCreationTimestamp="2025-10-06 12:28:47 +0000 UTC" firstStartedPulling="2025-10-06 12:28:48.038232765 +0000 UTC m=+1214.587938540" lastFinishedPulling="2025-10-06 12:28:50.675020825 +0000 UTC m=+1217.224726590" observedRunningTime="2025-10-06 12:28:51.173883906 +0000 UTC m=+1217.723589711" watchObservedRunningTime="2025-10-06 12:28:51.188860362 +0000 UTC m=+1217.738566167" Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.509522 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.537634 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.537806 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.542859 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 12:28:51 crc kubenswrapper[4892]: I1006 12:28:51.571546 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 06 12:28:52 crc kubenswrapper[4892]: I1006 12:28:52.213987 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 12:28:52 crc kubenswrapper[4892]: I1006 12:28:52.579505 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:28:52 crc kubenswrapper[4892]: I1006 12:28:52.579844 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 06 12:28:52 crc kubenswrapper[4892]: I1006 12:28:52.984472 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:28:52 crc kubenswrapper[4892]: I1006 12:28:52.984534 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:28:57 crc kubenswrapper[4892]: I1006 12:28:57.605958 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 12:28:57 crc kubenswrapper[4892]: I1006 12:28:57.611548 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 12:28:57 crc kubenswrapper[4892]: I1006 12:28:57.617628 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 12:28:58 crc kubenswrapper[4892]: I1006 12:28:58.259037 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 12:28:58 crc kubenswrapper[4892]: E1006 12:28:58.777034 4892 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda10b50f1_1410_48d7_a999_43c250a201de.slice/crio-b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a: Error finding container b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a: Status 404 returned error can't find the container with id b8de47ee95133c4f1336bd5551c9d78adef5569ba7401ddcb7038c086c0a848a Oct 06 12:28:58 crc kubenswrapper[4892]: E1006 12:28:58.783510 4892 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb968_deb3_4ce5_9912_8e452f23b35b.slice/crio-872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a: Error finding container 872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a: Status 404 returned error can't find the container with id 872bae6933ca3416842ed82308357f618dc8ed53966db36ba8a140a362bd7f0a Oct 06 12:28:58 crc kubenswrapper[4892]: E1006 12:28:58.786170 4892 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e1fe6c8_3c90_4a54_8175_826d62550610.slice/crio-edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1: Error finding container edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1: Status 404 returned error can't find the container with id edc3f8012707975960f2eb220c0222cd5f4606887c15a0c1f57e334ef746e8d1 Oct 06 12:28:59 crc kubenswrapper[4892]: E1006 12:28:59.046579 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb968_deb3_4ce5_9912_8e452f23b35b.slice/crio-conmon-bc33889523b250f2ece09404b373c7622b6ac3391bdc923a873fee1800f74d80.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c792645_87a8_4be5_9a72_927b0b99de2e.slice/crio-conmon-c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c792645_87a8_4be5_9a72_927b0b99de2e.slice/crio-c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb968_deb3_4ce5_9912_8e452f23b35b.slice/crio-bc33889523b250f2ece09404b373c7622b6ac3391bdc923a873fee1800f74d80.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb968_deb3_4ce5_9912_8e452f23b35b.slice/crio-d7adf2a25e7308a944dff4171113fc14431e0191db01f9bb204e928abc917580.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb968_deb3_4ce5_9912_8e452f23b35b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1bbb968_deb3_4ce5_9912_8e452f23b35b.slice/crio-conmon-d7adf2a25e7308a944dff4171113fc14431e0191db01f9bb204e928abc917580.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.177288 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.235854 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-config-data\") pod \"8c792645-87a8-4be5-9a72-927b0b99de2e\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.236645 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-combined-ca-bundle\") pod \"8c792645-87a8-4be5-9a72-927b0b99de2e\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.236725 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xds9d\" (UniqueName: \"kubernetes.io/projected/8c792645-87a8-4be5-9a72-927b0b99de2e-kube-api-access-xds9d\") pod \"8c792645-87a8-4be5-9a72-927b0b99de2e\" (UID: \"8c792645-87a8-4be5-9a72-927b0b99de2e\") " Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.243983 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c792645-87a8-4be5-9a72-927b0b99de2e-kube-api-access-xds9d" (OuterVolumeSpecName: "kube-api-access-xds9d") pod "8c792645-87a8-4be5-9a72-927b0b99de2e" (UID: "8c792645-87a8-4be5-9a72-927b0b99de2e"). InnerVolumeSpecName "kube-api-access-xds9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.267462 4892 generic.go:334] "Generic (PLEG): container finished" podID="8c792645-87a8-4be5-9a72-927b0b99de2e" containerID="c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f" exitCode=137 Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.267561 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.267632 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c792645-87a8-4be5-9a72-927b0b99de2e","Type":"ContainerDied","Data":"c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f"} Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.267682 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8c792645-87a8-4be5-9a72-927b0b99de2e","Type":"ContainerDied","Data":"cb684b65bfbd065089c9ee7e1942b8e9e6efdf27b6801182f74f33c3bd606cfd"} Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.267713 4892 scope.go:117] "RemoveContainer" containerID="c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.273113 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-config-data" (OuterVolumeSpecName: "config-data") pod "8c792645-87a8-4be5-9a72-927b0b99de2e" (UID: "8c792645-87a8-4be5-9a72-927b0b99de2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.284422 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c792645-87a8-4be5-9a72-927b0b99de2e" (UID: "8c792645-87a8-4be5-9a72-927b0b99de2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.342823 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xds9d\" (UniqueName: \"kubernetes.io/projected/8c792645-87a8-4be5-9a72-927b0b99de2e-kube-api-access-xds9d\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.342847 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.342860 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c792645-87a8-4be5-9a72-927b0b99de2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.342895 4892 scope.go:117] "RemoveContainer" containerID="c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f" Oct 06 12:28:59 crc kubenswrapper[4892]: E1006 12:28:59.343706 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f\": container with ID starting with c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f not found: ID does not exist" containerID="c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.343738 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f"} err="failed to get container status \"c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f\": rpc error: code = NotFound 
desc = could not find container \"c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f\": container with ID starting with c67c94234a686c95f2da0849c4322c1de5344031e4d5cd14d880132719669a3f not found: ID does not exist" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.643232 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.663574 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.674661 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:28:59 crc kubenswrapper[4892]: E1006 12:28:59.675097 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c792645-87a8-4be5-9a72-927b0b99de2e" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.675111 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c792645-87a8-4be5-9a72-927b0b99de2e" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.675403 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c792645-87a8-4be5-9a72-927b0b99de2e" containerName="nova-cell1-novncproxy-novncproxy" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.686909 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.687077 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.690201 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.690542 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.690680 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.751642 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.751730 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.751798 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.751889 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.751992 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpgm2\" (UniqueName: \"kubernetes.io/projected/67c67192-e4bc-41c1-894c-692a7641934b-kube-api-access-mpgm2\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.853824 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.853894 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.853954 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.854059 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.854158 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpgm2\" (UniqueName: \"kubernetes.io/projected/67c67192-e4bc-41c1-894c-692a7641934b-kube-api-access-mpgm2\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.858864 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.860565 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.860902 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.861992 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67c67192-e4bc-41c1-894c-692a7641934b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:28:59 crc kubenswrapper[4892]: I1006 12:28:59.875500 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpgm2\" (UniqueName: \"kubernetes.io/projected/67c67192-e4bc-41c1-894c-692a7641934b-kube-api-access-mpgm2\") pod \"nova-cell1-novncproxy-0\" (UID: \"67c67192-e4bc-41c1-894c-692a7641934b\") " pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:29:00 crc kubenswrapper[4892]: I1006 12:29:00.017901 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:29:00 crc kubenswrapper[4892]: I1006 12:29:00.189691 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c792645-87a8-4be5-9a72-927b0b99de2e" path="/var/lib/kubelet/pods/8c792645-87a8-4be5-9a72-927b0b99de2e/volumes" Oct 06 12:29:00 crc kubenswrapper[4892]: I1006 12:29:00.514775 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 06 12:29:00 crc kubenswrapper[4892]: W1006 12:29:00.525824 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c67192_e4bc_41c1_894c_692a7641934b.slice/crio-0be50cd24d49dd16ef4c8e495f9f3ea27451acaf20d829f36ef3ddd331fae4cc WatchSource:0}: Error finding container 0be50cd24d49dd16ef4c8e495f9f3ea27451acaf20d829f36ef3ddd331fae4cc: Status 404 returned error can't find the container with id 0be50cd24d49dd16ef4c8e495f9f3ea27451acaf20d829f36ef3ddd331fae4cc Oct 06 12:29:01 crc kubenswrapper[4892]: I1006 12:29:01.323699 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"67c67192-e4bc-41c1-894c-692a7641934b","Type":"ContainerStarted","Data":"5e3ab3d8e65c5d5a2df9a0f523433257d76080ebbd837c0d4926a0e30f36466a"} Oct 06 12:29:01 crc kubenswrapper[4892]: I1006 12:29:01.324035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"67c67192-e4bc-41c1-894c-692a7641934b","Type":"ContainerStarted","Data":"0be50cd24d49dd16ef4c8e495f9f3ea27451acaf20d829f36ef3ddd331fae4cc"} Oct 06 12:29:01 crc kubenswrapper[4892]: I1006 12:29:01.347984 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.3477968860000002 podStartE2EDuration="2.347796886s" podCreationTimestamp="2025-10-06 12:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:29:01.336874639 +0000 UTC m=+1227.886580404" watchObservedRunningTime="2025-10-06 12:29:01.347796886 +0000 UTC m=+1227.897502651" Oct 06 12:29:01 crc kubenswrapper[4892]: I1006 12:29:01.548041 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 12:29:01 crc kubenswrapper[4892]: I1006 12:29:01.548386 4892 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 12:29:01 crc kubenswrapper[4892]: I1006 12:29:01.551159 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 12:29:01 crc kubenswrapper[4892]: I1006 12:29:01.558172 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.337127 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.349029 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.559254 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5988d6d7-6bx28"] Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.561530 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.616535 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5988d6d7-6bx28"] Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.641780 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-config\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.641856 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dnp4\" (UniqueName: \"kubernetes.io/projected/d9454f4b-def4-456d-8a43-1bb27e3d89bc-kube-api-access-9dnp4\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.642023 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-svc\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.642193 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.642309 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.642445 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.745061 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.744163 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.745225 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.745898 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.746240 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-config\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.746285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dnp4\" (UniqueName: \"kubernetes.io/projected/d9454f4b-def4-456d-8a43-1bb27e3d89bc-kube-api-access-9dnp4\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.747254 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-config\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.747424 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-svc\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.748348 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-svc\") pod 
\"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.748456 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.749281 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.767408 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dnp4\" (UniqueName: \"kubernetes.io/projected/d9454f4b-def4-456d-8a43-1bb27e3d89bc-kube-api-access-9dnp4\") pod \"dnsmasq-dns-7d5988d6d7-6bx28\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") " pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:02 crc kubenswrapper[4892]: I1006 12:29:02.941362 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:03 crc kubenswrapper[4892]: I1006 12:29:03.405572 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5988d6d7-6bx28"] Oct 06 12:29:03 crc kubenswrapper[4892]: W1006 12:29:03.407066 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9454f4b_def4_456d_8a43_1bb27e3d89bc.slice/crio-1c9d86697adf389e53f4dcaa040e5f00f40b966540c7cab1d6832ce8aaee598e WatchSource:0}: Error finding container 1c9d86697adf389e53f4dcaa040e5f00f40b966540c7cab1d6832ce8aaee598e: Status 404 returned error can't find the container with id 1c9d86697adf389e53f4dcaa040e5f00f40b966540c7cab1d6832ce8aaee598e Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.364705 4892 generic.go:334] "Generic (PLEG): container finished" podID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerID="7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630" exitCode=0 Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.366054 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" event={"ID":"d9454f4b-def4-456d-8a43-1bb27e3d89bc","Type":"ContainerDied","Data":"7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630"} Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.366089 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" event={"ID":"d9454f4b-def4-456d-8a43-1bb27e3d89bc","Type":"ContainerStarted","Data":"1c9d86697adf389e53f4dcaa040e5f00f40b966540c7cab1d6832ce8aaee598e"} Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.569246 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.569556 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-central-agent" containerID="cri-o://a573102c5e65eb1daaacefa307398f3d7f668695ece2a67065b4aaa6f91e0d78" 
gracePeriod=30 Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.570076 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-notification-agent" containerID="cri-o://17869140300ce402b3a44f363171d108e7f6759020f18697b67e1dbc7e8dd881" gracePeriod=30 Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.570133 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="sg-core" containerID="cri-o://d732ccb1e7a4efc6eed26b003d6d300368f2035a1c9be1f9e8b2b5fc1b345fc0" gracePeriod=30 Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.569698 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="proxy-httpd" containerID="cri-o://b97ae8a4b05a8e04a20845c85bb3296caeb106dc7330fc0274890c65c5bda6bc" gracePeriod=30 Oct 06 12:29:04 crc kubenswrapper[4892]: I1006 12:29:04.575890 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.222:3000/\": read tcp 10.217.0.2:50386->10.217.0.222:3000: read: connection reset by peer" Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.018319 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.258054 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.392054 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" event={"ID":"d9454f4b-def4-456d-8a43-1bb27e3d89bc","Type":"ContainerStarted","Data":"19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b"} Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.392394 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404064 4892 generic.go:334] "Generic (PLEG): container finished" podID="630beda2-2a6b-43e4-a495-12b3938e3139" containerID="b97ae8a4b05a8e04a20845c85bb3296caeb106dc7330fc0274890c65c5bda6bc" exitCode=0 Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404108 4892 generic.go:334] "Generic (PLEG): container finished" podID="630beda2-2a6b-43e4-a495-12b3938e3139" containerID="d732ccb1e7a4efc6eed26b003d6d300368f2035a1c9be1f9e8b2b5fc1b345fc0" exitCode=2 Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404123 4892 generic.go:334] "Generic (PLEG): container finished" podID="630beda2-2a6b-43e4-a495-12b3938e3139" containerID="a573102c5e65eb1daaacefa307398f3d7f668695ece2a67065b4aaa6f91e0d78" exitCode=0 Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404305 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-log" containerID="cri-o://30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5" gracePeriod=30 Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404497 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerDied","Data":"b97ae8a4b05a8e04a20845c85bb3296caeb106dc7330fc0274890c65c5bda6bc"} Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404558 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerDied","Data":"d732ccb1e7a4efc6eed26b003d6d300368f2035a1c9be1f9e8b2b5fc1b345fc0"} Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404573 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerDied","Data":"a573102c5e65eb1daaacefa307398f3d7f668695ece2a67065b4aaa6f91e0d78"} Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.404622 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-api" containerID="cri-o://11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b" gracePeriod=30 Oct 06 12:29:05 crc kubenswrapper[4892]: I1006 12:29:05.430673 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" podStartSLOduration=3.430650062 podStartE2EDuration="3.430650062s" podCreationTimestamp="2025-10-06 12:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:29:05.417906562 +0000 UTC m=+1231.967612337" watchObservedRunningTime="2025-10-06 12:29:05.430650062 +0000 UTC m=+1231.980355837" Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.429944 4892 generic.go:334] "Generic (PLEG): container finished" podID="8825961a-8b70-42b1-826f-28ed5d76499f" containerID="30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5" exitCode=143 Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.430486 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8825961a-8b70-42b1-826f-28ed5d76499f","Type":"ContainerDied","Data":"30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5"} Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.847743 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.936990 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/8825961a-8b70-42b1-826f-28ed5d76499f-kube-api-access-xchc4\") pod \"8825961a-8b70-42b1-826f-28ed5d76499f\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.937191 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-config-data\") pod \"8825961a-8b70-42b1-826f-28ed5d76499f\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.937305 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-combined-ca-bundle\") pod \"8825961a-8b70-42b1-826f-28ed5d76499f\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.937450 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8825961a-8b70-42b1-826f-28ed5d76499f-logs\") pod \"8825961a-8b70-42b1-826f-28ed5d76499f\" (UID: \"8825961a-8b70-42b1-826f-28ed5d76499f\") " Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.938945 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8825961a-8b70-42b1-826f-28ed5d76499f-logs" (OuterVolumeSpecName: "logs") pod "8825961a-8b70-42b1-826f-28ed5d76499f" (UID: "8825961a-8b70-42b1-826f-28ed5d76499f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.939438 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8825961a-8b70-42b1-826f-28ed5d76499f-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.945396 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8825961a-8b70-42b1-826f-28ed5d76499f-kube-api-access-xchc4" (OuterVolumeSpecName: "kube-api-access-xchc4") pod "8825961a-8b70-42b1-826f-28ed5d76499f" (UID: "8825961a-8b70-42b1-826f-28ed5d76499f"). InnerVolumeSpecName "kube-api-access-xchc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.978449 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-config-data" (OuterVolumeSpecName: "config-data") pod "8825961a-8b70-42b1-826f-28ed5d76499f" (UID: "8825961a-8b70-42b1-826f-28ed5d76499f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:06 crc kubenswrapper[4892]: I1006 12:29:06.989737 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8825961a-8b70-42b1-826f-28ed5d76499f" (UID: "8825961a-8b70-42b1-826f-28ed5d76499f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.042483 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xchc4\" (UniqueName: \"kubernetes.io/projected/8825961a-8b70-42b1-826f-28ed5d76499f-kube-api-access-xchc4\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.042513 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.042523 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8825961a-8b70-42b1-826f-28ed5d76499f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.443136 4892 generic.go:334] "Generic (PLEG): container finished" podID="8825961a-8b70-42b1-826f-28ed5d76499f" containerID="11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b" exitCode=0 Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.443175 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8825961a-8b70-42b1-826f-28ed5d76499f","Type":"ContainerDied","Data":"11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b"} Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.443205 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8825961a-8b70-42b1-826f-28ed5d76499f","Type":"ContainerDied","Data":"21d5a64b46383c96c32491c3d58597e83db8475307251d940c108bef6a4c5370"} Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.443227 4892 scope.go:117] "RemoveContainer" containerID="11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.443255 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.538482 4892 scope.go:117] "RemoveContainer" containerID="30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.547179 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.569490 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.577840 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.578052 4892 scope.go:117] "RemoveContainer" containerID="11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b" Oct 06 12:29:07 crc kubenswrapper[4892]: E1006 12:29:07.578464 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-api" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.578488 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-api" Oct 06 12:29:07 crc kubenswrapper[4892]: E1006 12:29:07.578505 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-log" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.578512 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-log" Oct 06 12:29:07 crc kubenswrapper[4892]: E1006 12:29:07.578683 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b\": container with ID starting with 11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b not found: ID does not exist" containerID="11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.578742 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b"} err="failed to get container status \"11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b\": rpc error: code = NotFound desc = could not find container \"11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b\": container with ID starting with 11b839ebe7d13878248ff4c3c52f7a01d4bea35000854a4f59de269556cec38b not found: ID does not exist" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.578785 4892 scope.go:117] "RemoveContainer" containerID="30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.578787 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-api" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.578891 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" containerName="nova-api-log" Oct 06 12:29:07 crc kubenswrapper[4892]: E1006 12:29:07.579117 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5\": container with ID starting with 
30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5 not found: ID does not exist" containerID="30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.579160 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5"} err="failed to get container status \"30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5\": rpc error: code = NotFound desc = could not find container \"30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5\": container with ID starting with 30f43dee247a7f79720fce17cf6a16f3e9a363505e39777ba5639dd6259ec7f5 not found: ID does not exist" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.580313 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.585901 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.586190 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.586364 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.586680 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.662154 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e3ba83-e6d6-4890-b74a-540e333433a9-logs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.662412 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bgx\" (UniqueName: \"kubernetes.io/projected/b5e3ba83-e6d6-4890-b74a-540e333433a9-kube-api-access-l9bgx\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.662520 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.662597 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.662685 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.662801 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-config-data\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.764480 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bgx\" (UniqueName: \"kubernetes.io/projected/b5e3ba83-e6d6-4890-b74a-540e333433a9-kube-api-access-l9bgx\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.764527 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.764555 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.764578 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.764638 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-config-data\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.764706 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e3ba83-e6d6-4890-b74a-540e333433a9-logs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.765123 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e3ba83-e6d6-4890-b74a-540e333433a9-logs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.770019 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.770566 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-public-tls-certs\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.771422 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-config-data\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.789047 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bgx\" (UniqueName: \"kubernetes.io/projected/b5e3ba83-e6d6-4890-b74a-540e333433a9-kube-api-access-l9bgx\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.793570 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " pod="openstack/nova-api-0" Oct 06 12:29:07 crc kubenswrapper[4892]: I1006 12:29:07.905063 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:08 crc kubenswrapper[4892]: I1006 12:29:08.187606 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8825961a-8b70-42b1-826f-28ed5d76499f" path="/var/lib/kubelet/pods/8825961a-8b70-42b1-826f-28ed5d76499f/volumes" Oct 06 12:29:08 crc kubenswrapper[4892]: I1006 12:29:08.402107 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:08 crc kubenswrapper[4892]: I1006 12:29:08.454642 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5e3ba83-e6d6-4890-b74a-540e333433a9","Type":"ContainerStarted","Data":"3951acae36bf8f7455fa6ae0fdd7294bfb1185da52247af09d4a586ff588ea7c"} Oct 06 12:29:09 crc kubenswrapper[4892]: E1006 12:29:09.308712 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630beda2_2a6b_43e4_a495_12b3938e3139.slice/crio-17869140300ce402b3a44f363171d108e7f6759020f18697b67e1dbc7e8dd881.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.478693 4892 generic.go:334] "Generic (PLEG): container finished" podID="630beda2-2a6b-43e4-a495-12b3938e3139" containerID="17869140300ce402b3a44f363171d108e7f6759020f18697b67e1dbc7e8dd881" exitCode=0 Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.479016 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerDied","Data":"17869140300ce402b3a44f363171d108e7f6759020f18697b67e1dbc7e8dd881"} Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.482516 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5e3ba83-e6d6-4890-b74a-540e333433a9","Type":"ContainerStarted","Data":"c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43"} Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.482553 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5e3ba83-e6d6-4890-b74a-540e333433a9","Type":"ContainerStarted","Data":"2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae"} Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.518580 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.518561105 
podStartE2EDuration="2.518561105s" podCreationTimestamp="2025-10-06 12:29:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:29:09.507128101 +0000 UTC m=+1236.056833866" watchObservedRunningTime="2025-10-06 12:29:09.518561105 +0000 UTC m=+1236.068266860" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.692707 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819198 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-run-httpd\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819460 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-log-httpd\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819515 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-sg-core-conf-yaml\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819565 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-scripts\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819566 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819612 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-config-data\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819653 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-combined-ca-bundle\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819698 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbgkl\" (UniqueName: \"kubernetes.io/projected/630beda2-2a6b-43e4-a495-12b3938e3139-kube-api-access-bbgkl\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.819727 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-ceilometer-tls-certs\") pod \"630beda2-2a6b-43e4-a495-12b3938e3139\" (UID: \"630beda2-2a6b-43e4-a495-12b3938e3139\") " Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.820217 4892 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.820559 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.824811 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-scripts" (OuterVolumeSpecName: "scripts") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.830349 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630beda2-2a6b-43e4-a495-12b3938e3139-kube-api-access-bbgkl" (OuterVolumeSpecName: "kube-api-access-bbgkl") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "kube-api-access-bbgkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.864653 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.888422 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.916483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.922093 4892 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/630beda2-2a6b-43e4-a495-12b3938e3139-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.922124 4892 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.922138 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.922146 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.922155 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbgkl\" (UniqueName: \"kubernetes.io/projected/630beda2-2a6b-43e4-a495-12b3938e3139-kube-api-access-bbgkl\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.922165 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:09 crc kubenswrapper[4892]: I1006 12:29:09.937912 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-config-data" (OuterVolumeSpecName: "config-data") pod "630beda2-2a6b-43e4-a495-12b3938e3139" (UID: "630beda2-2a6b-43e4-a495-12b3938e3139"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.018057 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.023103 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/630beda2-2a6b-43e4-a495-12b3938e3139-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.040257 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.501770 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.501734 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"630beda2-2a6b-43e4-a495-12b3938e3139","Type":"ContainerDied","Data":"2591b70f94202f39d3d37860149a27cbe8f931c3293382c0d17138e570d9a8f9"} Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.502201 4892 scope.go:117] "RemoveContainer" containerID="b97ae8a4b05a8e04a20845c85bb3296caeb106dc7330fc0274890c65c5bda6bc" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.540280 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.542088 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.544605 4892 scope.go:117] "RemoveContainer" containerID="d732ccb1e7a4efc6eed26b003d6d300368f2035a1c9be1f9e8b2b5fc1b345fc0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.568636 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.579969 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:29:10 crc kubenswrapper[4892]: E1006 12:29:10.580441 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="proxy-httpd" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580465 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="proxy-httpd" Oct 06 12:29:10 crc kubenswrapper[4892]: E1006 12:29:10.580505 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="sg-core" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580514 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="sg-core" Oct 06 12:29:10 crc kubenswrapper[4892]: E1006 12:29:10.580532 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-central-agent" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580540 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-central-agent" Oct 06 12:29:10 crc kubenswrapper[4892]: E1006 12:29:10.580553 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-notification-agent" Oct 06 12:29:10 crc 
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580563 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-notification-agent"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580795 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-central-agent"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580816 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="sg-core"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580830 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="proxy-httpd"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.580839 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" containerName="ceilometer-notification-agent"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.584237 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.586860 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.587193 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.590773 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.611281 4892 scope.go:117] "RemoveContainer" containerID="17869140300ce402b3a44f363171d108e7f6759020f18697b67e1dbc7e8dd881"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.617168 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.636627 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.636767 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-config-data\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.636876 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085bae0e-dea7-4bb8-8cf9-3855eff9336c-run-httpd\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.636898 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
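The cpu_manager/memory_manager "RemoveStaleState" entries above fire because ceilometer-0 came back under a new pod UID (085bae0e-…), so per-UID assignments recorded for the old UID (630beda2-…) must be pruned before the replacement is admitted. A toy sketch of pruning a checkpoint map keyed by pod UID and container name (invented types, not the kubelet's state package):

```go
package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops per-container assignments whose pod UID is no
// longer active, mirroring the cpu_manager/memory_manager log lines.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(assignments, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	assignments := map[key]string{
		{"630beda2", "proxy-httpd"}: "cpuset 0-1", // old, replaced pod UID
		{"085bae0e", "sg-core"}:     "cpuset 2-3", // current pod UID
	}
	removeStaleState(assignments, map[string]bool{"085bae0e": true})
	fmt.Println("remaining assignments:", len(assignments)) // 1
}
```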
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.636937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085bae0e-dea7-4bb8-8cf9-3855eff9336c-log-httpd\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.636974 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfdwm\" (UniqueName: \"kubernetes.io/projected/085bae0e-dea7-4bb8-8cf9-3855eff9336c-kube-api-access-hfdwm\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.637020 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-scripts\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.637059 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.670953 4892 scope.go:117] "RemoveContainer" containerID="a573102c5e65eb1daaacefa307398f3d7f668695ece2a67065b4aaa6f91e0d78"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.732827 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-m6vm8"]
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.734457 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m6vm8"
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.738570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-config-data\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.738717 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085bae0e-dea7-4bb8-8cf9-3855eff9336c-run-httpd\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.738756 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.738822 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085bae0e-dea7-4bb8-8cf9-3855eff9336c-log-httpd\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.738908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfdwm\" (UniqueName: \"kubernetes.io/projected/085bae0e-dea7-4bb8-8cf9-3855eff9336c-kube-api-access-hfdwm\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.739035 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-scripts\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.739086 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.739152 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.740661 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.740763 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085bae0e-dea7-4bb8-8cf9-3855eff9336c-log-httpd\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.740797 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/085bae0e-dea7-4bb8-8cf9-3855eff9336c-run-httpd\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.740862 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.742840 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-m6vm8"] Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.744795 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-scripts\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.744793 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.744926 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-config-data\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.748655 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.761976 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfdwm\" (UniqueName: \"kubernetes.io/projected/085bae0e-dea7-4bb8-8cf9-3855eff9336c-kube-api-access-hfdwm\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.762109 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/085bae0e-dea7-4bb8-8cf9-3855eff9336c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"085bae0e-dea7-4bb8-8cf9-3855eff9336c\") " pod="openstack/ceilometer-0" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.840495 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.840569 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxk2n\" (UniqueName: \"kubernetes.io/projected/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-kube-api-access-dxk2n\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.840625 4892 
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.840666 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-config-data\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.908688 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.942181 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-scripts\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.942248 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-config-data\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.942406 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.942474 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxk2n\" (UniqueName: \"kubernetes.io/projected/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-kube-api-access-dxk2n\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.949519 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-config-data\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.954910 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-scripts\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.957897 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8"
pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:10 crc kubenswrapper[4892]: I1006 12:29:10.962663 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxk2n\" (UniqueName: \"kubernetes.io/projected/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-kube-api-access-dxk2n\") pod \"nova-cell1-cell-mapping-m6vm8\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:11 crc kubenswrapper[4892]: I1006 12:29:11.147197 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:11 crc kubenswrapper[4892]: I1006 12:29:11.381565 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 06 12:29:11 crc kubenswrapper[4892]: I1006 12:29:11.532934 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085bae0e-dea7-4bb8-8cf9-3855eff9336c","Type":"ContainerStarted","Data":"1e9a3cc71c4b1968d446ead8abdcea228bd83b39b9ea6c1ac95e278951d6d019"} Oct 06 12:29:11 crc kubenswrapper[4892]: I1006 12:29:11.603122 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-m6vm8"] Oct 06 12:29:11 crc kubenswrapper[4892]: W1006 12:29:11.626574 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0f88b14_cdad_4ccd_865b_6f57c82a1a8a.slice/crio-6310ede416773dedcf549fbca16778ba2896ff6da7bf1e1d589c99e6a09e0872 WatchSource:0}: Error finding container 6310ede416773dedcf549fbca16778ba2896ff6da7bf1e1d589c99e6a09e0872: Status 404 returned error can't find the container with id 6310ede416773dedcf549fbca16778ba2896ff6da7bf1e1d589c99e6a09e0872 Oct 06 12:29:12 crc kubenswrapper[4892]: I1006 12:29:12.187475 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630beda2-2a6b-43e4-a495-12b3938e3139" path="/var/lib/kubelet/pods/630beda2-2a6b-43e4-a495-12b3938e3139/volumes" Oct 06 12:29:12 crc kubenswrapper[4892]: I1006 12:29:12.548254 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085bae0e-dea7-4bb8-8cf9-3855eff9336c","Type":"ContainerStarted","Data":"6370850a2c916c55c055b48a330071a2cb2e15e1b9fdabc31b2c69defb46f45b"} Oct 06 12:29:12 crc kubenswrapper[4892]: I1006 12:29:12.548669 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085bae0e-dea7-4bb8-8cf9-3855eff9336c","Type":"ContainerStarted","Data":"54b77d192f36d04ae0f5b8e20b04cde6f090a5af9b45a72764603074be8c1e6e"} Oct 06 12:29:12 crc kubenswrapper[4892]: I1006 12:29:12.551651 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-m6vm8" event={"ID":"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a","Type":"ContainerStarted","Data":"204462ecb6e2fd9f11f1f836a1565b41dc9cba1a625f7c36cea3959dde9a8780"} Oct 06 12:29:12 crc kubenswrapper[4892]: I1006 12:29:12.551705 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-m6vm8" event={"ID":"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a","Type":"ContainerStarted","Data":"6310ede416773dedcf549fbca16778ba2896ff6da7bf1e1d589c99e6a09e0872"} Oct 06 12:29:12 crc kubenswrapper[4892]: I1006 12:29:12.945229 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:29:12 crc kubenswrapper[4892]: I1006 12:29:12.977120 4892 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-m6vm8" podStartSLOduration=2.97708701 podStartE2EDuration="2.97708701s" podCreationTimestamp="2025-10-06 12:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:29:12.584711525 +0000 UTC m=+1239.134417300" watchObservedRunningTime="2025-10-06 12:29:12.97708701 +0000 UTC m=+1239.526792775" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.025808 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54849b84c9-6zpxg"] Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.026812 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" podUID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerName="dnsmasq-dns" containerID="cri-o://cfad99feca6c74434b5ca451369cb5b5190835929a02390324a9135835bfbb95" gracePeriod=10 Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.568366 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085bae0e-dea7-4bb8-8cf9-3855eff9336c","Type":"ContainerStarted","Data":"df0b99c6eb816a160777c65854913e9996eee02a6c4ed63738e04323e27025ab"} Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.571187 4892 generic.go:334] "Generic (PLEG): container finished" podID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerID="cfad99feca6c74434b5ca451369cb5b5190835929a02390324a9135835bfbb95" exitCode=0 Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.571287 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" event={"ID":"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a","Type":"ContainerDied","Data":"cfad99feca6c74434b5ca451369cb5b5190835929a02390324a9135835bfbb95"} Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.571367 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" event={"ID":"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a","Type":"ContainerDied","Data":"935152dc2096409beb91855b8ed2944612c7e2525afb9d9575e8e1fcfdfa1a0a"} Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.571393 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="935152dc2096409beb91855b8ed2944612c7e2525afb9d9575e8e1fcfdfa1a0a" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.593801 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.713244 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-config\") pod \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.713390 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-svc\") pod \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.713436 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-swift-storage-0\") pod \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.713498 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-sb\") pod \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.713560 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-nb\") pod \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.713620 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wchqk\" (UniqueName: \"kubernetes.io/projected/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-kube-api-access-wchqk\") pod \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\" (UID: \"a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a\") " Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.743179 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-kube-api-access-wchqk" (OuterVolumeSpecName: "kube-api-access-wchqk") pod "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" (UID: "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a"). InnerVolumeSpecName "kube-api-access-wchqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.771353 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" (UID: "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.802421 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" (UID: "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.810194 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-config" (OuterVolumeSpecName: "config") pod "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" (UID: "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.815547 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.815579 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.815591 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wchqk\" (UniqueName: \"kubernetes.io/projected/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-kube-api-access-wchqk\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.815603 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.817170 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" (UID: "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.847783 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" (UID: "a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.918076 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:13 crc kubenswrapper[4892]: I1006 12:29:13.918115 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:14 crc kubenswrapper[4892]: I1006 12:29:14.585113 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54849b84c9-6zpxg" Oct 06 12:29:14 crc kubenswrapper[4892]: I1006 12:29:14.586353 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"085bae0e-dea7-4bb8-8cf9-3855eff9336c","Type":"ContainerStarted","Data":"1506dd933a09ffd6ece533800338ef48e3327c2a3876f93fd4dde0895c5f9dc4"} Oct 06 12:29:14 crc kubenswrapper[4892]: I1006 12:29:14.586383 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 06 12:29:14 crc kubenswrapper[4892]: I1006 12:29:14.617422 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.923911229 podStartE2EDuration="4.617403731s" podCreationTimestamp="2025-10-06 12:29:10 +0000 UTC" firstStartedPulling="2025-10-06 12:29:11.385848114 +0000 UTC m=+1237.935553869" lastFinishedPulling="2025-10-06 12:29:14.079340606 +0000 UTC m=+1240.629046371" observedRunningTime="2025-10-06 12:29:14.605695908 +0000 UTC m=+1241.155401673" watchObservedRunningTime="2025-10-06 12:29:14.617403731 +0000 UTC m=+1241.167109486" Oct 06 12:29:14 crc kubenswrapper[4892]: I1006 12:29:14.631916 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54849b84c9-6zpxg"] Oct 06 12:29:14 crc kubenswrapper[4892]: I1006 12:29:14.641985 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54849b84c9-6zpxg"] Oct 06 12:29:16 crc kubenswrapper[4892]: I1006 12:29:16.186830 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" path="/var/lib/kubelet/pods/a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a/volumes" Oct 06 12:29:17 crc kubenswrapper[4892]: I1006 12:29:17.626674 4892 generic.go:334] "Generic (PLEG): container finished" podID="c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" containerID="204462ecb6e2fd9f11f1f836a1565b41dc9cba1a625f7c36cea3959dde9a8780" exitCode=0 Oct 06 12:29:17 crc kubenswrapper[4892]: I1006 12:29:17.626853 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-m6vm8" event={"ID":"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a","Type":"ContainerDied","Data":"204462ecb6e2fd9f11f1f836a1565b41dc9cba1a625f7c36cea3959dde9a8780"} Oct 06 12:29:17 crc kubenswrapper[4892]: I1006 12:29:17.906559 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:29:17 crc kubenswrapper[4892]: I1006 12:29:17.906638 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:29:18 crc kubenswrapper[4892]: I1006 12:29:18.928732 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:29:18 crc kubenswrapper[4892]: I1006 12:29:18.928803 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.054609 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.133649 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-config-data\") pod \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.133699 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-combined-ca-bundle\") pod \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.133732 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-scripts\") pod \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.133793 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxk2n\" (UniqueName: \"kubernetes.io/projected/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-kube-api-access-dxk2n\") pod \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\" (UID: \"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a\") " Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.140610 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-scripts" (OuterVolumeSpecName: "scripts") pod "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" (UID: "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.141284 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-kube-api-access-dxk2n" (OuterVolumeSpecName: "kube-api-access-dxk2n") pod "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" (UID: "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a"). InnerVolumeSpecName "kube-api-access-dxk2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.168511 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" (UID: "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.185575 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-config-data" (OuterVolumeSpecName: "config-data") pod "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" (UID: "c0f88b14-cdad-4ccd-865b-6f57c82a1a8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.236581 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxk2n\" (UniqueName: \"kubernetes.io/projected/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-kube-api-access-dxk2n\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.236618 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.236631 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.236641 4892 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a-scripts\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.664240 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-m6vm8" event={"ID":"c0f88b14-cdad-4ccd-865b-6f57c82a1a8a","Type":"ContainerDied","Data":"6310ede416773dedcf549fbca16778ba2896ff6da7bf1e1d589c99e6a09e0872"} Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.664658 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6310ede416773dedcf549fbca16778ba2896ff6da7bf1e1d589c99e6a09e0872" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.664316 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-m6vm8" Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.849771 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.850000 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="40e26ba4-7fe8-4ae2-9839-9775c60c7e90" containerName="nova-scheduler-scheduler" containerID="cri-o://8ce6783a0f879b03f4a2fb9cdc1048d4383acb87ac4baa532a77af16419c5c5d" gracePeriod=30 Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.862702 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.862993 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-log" containerID="cri-o://2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae" gracePeriod=30 Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.863060 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-api" containerID="cri-o://c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43" gracePeriod=30 Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.915084 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.915292 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" 
containerName="nova-metadata-log" containerID="cri-o://bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35" gracePeriod=30 Oct 06 12:29:19 crc kubenswrapper[4892]: I1006 12:29:19.915429 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-metadata" containerID="cri-o://a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a" gracePeriod=30 Oct 06 12:29:20 crc kubenswrapper[4892]: I1006 12:29:20.684889 4892 generic.go:334] "Generic (PLEG): container finished" podID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerID="2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae" exitCode=143 Oct 06 12:29:20 crc kubenswrapper[4892]: I1006 12:29:20.684973 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5e3ba83-e6d6-4890-b74a-540e333433a9","Type":"ContainerDied","Data":"2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae"} Oct 06 12:29:20 crc kubenswrapper[4892]: I1006 12:29:20.688105 4892 generic.go:334] "Generic (PLEG): container finished" podID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerID="bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35" exitCode=143 Oct 06 12:29:20 crc kubenswrapper[4892]: I1006 12:29:20.688185 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4","Type":"ContainerDied","Data":"bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35"} Oct 06 12:29:20 crc kubenswrapper[4892]: I1006 12:29:20.690736 4892 generic.go:334] "Generic (PLEG): container finished" podID="40e26ba4-7fe8-4ae2-9839-9775c60c7e90" containerID="8ce6783a0f879b03f4a2fb9cdc1048d4383acb87ac4baa532a77af16419c5c5d" exitCode=0 Oct 06 12:29:20 crc kubenswrapper[4892]: I1006 12:29:20.690766 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40e26ba4-7fe8-4ae2-9839-9775c60c7e90","Type":"ContainerDied","Data":"8ce6783a0f879b03f4a2fb9cdc1048d4383acb87ac4baa532a77af16419c5c5d"} Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.084905 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.179660 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-config-data\") pod \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.179773 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-combined-ca-bundle\") pod \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.179814 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f7fp\" (UniqueName: \"kubernetes.io/projected/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-kube-api-access-9f7fp\") pod \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\" (UID: \"40e26ba4-7fe8-4ae2-9839-9775c60c7e90\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.184962 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-kube-api-access-9f7fp" (OuterVolumeSpecName: "kube-api-access-9f7fp") pod "40e26ba4-7fe8-4ae2-9839-9775c60c7e90" (UID: "40e26ba4-7fe8-4ae2-9839-9775c60c7e90"). InnerVolumeSpecName "kube-api-access-9f7fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.216190 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40e26ba4-7fe8-4ae2-9839-9775c60c7e90" (UID: "40e26ba4-7fe8-4ae2-9839-9775c60c7e90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.219216 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-config-data" (OuterVolumeSpecName: "config-data") pod "40e26ba4-7fe8-4ae2-9839-9775c60c7e90" (UID: "40e26ba4-7fe8-4ae2-9839-9775c60c7e90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.311072 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.311112 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.311124 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f7fp\" (UniqueName: \"kubernetes.io/projected/40e26ba4-7fe8-4ae2-9839-9775c60c7e90-kube-api-access-9f7fp\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.318666 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.412692 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-nova-metadata-tls-certs\") pod \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.412951 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-logs\") pod \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.412999 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trndf\" (UniqueName: \"kubernetes.io/projected/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-kube-api-access-trndf\") pod \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.413171 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-config-data\") pod \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.413215 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-combined-ca-bundle\") pod \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\" (UID: \"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4\") " Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.413395 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-logs" (OuterVolumeSpecName: "logs") pod "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" (UID: "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.413886 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.419483 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-kube-api-access-trndf" (OuterVolumeSpecName: "kube-api-access-trndf") pod "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" (UID: "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4"). InnerVolumeSpecName "kube-api-access-trndf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.442428 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-config-data" (OuterVolumeSpecName: "config-data") pod "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" (UID: "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.452106 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" (UID: "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.465672 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" (UID: "fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.516230 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.516262 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.516274 4892 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.516283 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trndf\" (UniqueName: \"kubernetes.io/projected/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4-kube-api-access-trndf\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.707981 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.708045 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40e26ba4-7fe8-4ae2-9839-9775c60c7e90","Type":"ContainerDied","Data":"60447312dd9768b19712fd444f2e15cff94f8867455fd19f6f9c23e8bd6f23c4"} Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.708419 4892 scope.go:117] "RemoveContainer" containerID="8ce6783a0f879b03f4a2fb9cdc1048d4383acb87ac4baa532a77af16419c5c5d" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.713726 4892 generic.go:334] "Generic (PLEG): container finished" podID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerID="a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a" exitCode=0 Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.713765 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4","Type":"ContainerDied","Data":"a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a"} Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.713861 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.714620 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4","Type":"ContainerDied","Data":"acc3bfb47282a9d1eb78a0d127709d5087c363c5e4c1086bdd8b8a665189b831"} Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.748074 4892 scope.go:117] "RemoveContainer" containerID="a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.773494 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.778049 4892 scope.go:117] "RemoveContainer" containerID="bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.783689 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.792700 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.794106 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e26ba4-7fe8-4ae2-9839-9775c60c7e90" containerName="nova-scheduler-scheduler" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794132 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e26ba4-7fe8-4ae2-9839-9775c60c7e90" containerName="nova-scheduler-scheduler" Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.794162 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-metadata" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794174 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-metadata" Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.794199 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerName="init" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794211 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerName="init" Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.794240 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" containerName="nova-manage" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794251 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" containerName="nova-manage" Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.794269 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerName="dnsmasq-dns" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794279 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerName="dnsmasq-dns" Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.794301 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-log" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794311 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-log" Oct 06 
12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794622 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" containerName="nova-manage" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794645 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-log" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794664 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e26ba4-7fe8-4ae2-9839-9775c60c7e90" containerName="nova-scheduler-scheduler" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794675 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4737ce6-7463-41ae-8dcf-ab3f3f84cb9a" containerName="dnsmasq-dns" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.794706 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" containerName="nova-metadata-metadata" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.795784 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.806671 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.814564 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.814727 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.818658 4892 scope.go:117] "RemoveContainer" containerID="a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a" Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.820297 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a\": container with ID starting with a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a not found: ID does not exist" containerID="a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.820354 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a"} err="failed to get container status \"a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a\": rpc error: code = NotFound desc = could not find container \"a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a\": container with ID starting with a4b0a6e60568ac95270e809e0e331e8cbd1bfa146fb2c05851c7ddb9bced2d4a not found: ID does not exist" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.820383 4892 scope.go:117] "RemoveContainer" containerID="bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35" Oct 06 12:29:21 crc kubenswrapper[4892]: E1006 12:29:21.821063 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35\": container with ID starting with bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35 not found: ID does not exist" containerID="bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35" Oct 06 
12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.821097 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35"} err="failed to get container status \"bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35\": rpc error: code = NotFound desc = could not find container \"bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35\": container with ID starting with bd33e6b48f54795ac4ea85fbe6ececb00bb66252438ce18e846a6dd44582ff35 not found: ID does not exist" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.829056 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.840017 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.841843 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.847303 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.847922 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.849129 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.931645 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5354ced-b54e-4a88-934b-7bebbacccf1e-logs\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.931702 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cln7\" (UniqueName: \"kubernetes.io/projected/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-kube-api-access-2cln7\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.931773 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.931802 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.931906 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwjx8\" (UniqueName: \"kubernetes.io/projected/a5354ced-b54e-4a88-934b-7bebbacccf1e-kube-api-access-pwjx8\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 
12:29:21.931977 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-config-data\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.932018 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-config-data\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:21 crc kubenswrapper[4892]: I1006 12:29:21.932154 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.033990 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034086 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5354ced-b54e-4a88-934b-7bebbacccf1e-logs\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034122 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cln7\" (UniqueName: \"kubernetes.io/projected/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-kube-api-access-2cln7\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034186 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034210 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034243 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwjx8\" (UniqueName: \"kubernetes.io/projected/a5354ced-b54e-4a88-934b-7bebbacccf1e-kube-api-access-pwjx8\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034275 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-config-data\") 
pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034297 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-config-data\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.034821 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5354ced-b54e-4a88-934b-7bebbacccf1e-logs\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.038762 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-config-data\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.038763 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.040924 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-config-data\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.041125 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5354ced-b54e-4a88-934b-7bebbacccf1e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.047211 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.049002 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cln7\" (UniqueName: \"kubernetes.io/projected/c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c-kube-api-access-2cln7\") pod \"nova-scheduler-0\" (UID: \"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c\") " pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.056559 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwjx8\" (UniqueName: \"kubernetes.io/projected/a5354ced-b54e-4a88-934b-7bebbacccf1e-kube-api-access-pwjx8\") pod \"nova-metadata-0\" (UID: \"a5354ced-b54e-4a88-934b-7bebbacccf1e\") " pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.141243 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.166736 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.198562 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e26ba4-7fe8-4ae2-9839-9775c60c7e90" path="/var/lib/kubelet/pods/40e26ba4-7fe8-4ae2-9839-9775c60c7e90/volumes" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.199298 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4" path="/var/lib/kubelet/pods/fe0f2281-ad9f-44fb-833b-5fdc8dc53ac4/volumes" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.341881 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.459688 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-public-tls-certs\") pod \"b5e3ba83-e6d6-4890-b74a-540e333433a9\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.459768 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-internal-tls-certs\") pod \"b5e3ba83-e6d6-4890-b74a-540e333433a9\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.459838 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e3ba83-e6d6-4890-b74a-540e333433a9-logs\") pod \"b5e3ba83-e6d6-4890-b74a-540e333433a9\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.460020 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-config-data\") pod \"b5e3ba83-e6d6-4890-b74a-540e333433a9\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.460071 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9bgx\" (UniqueName: \"kubernetes.io/projected/b5e3ba83-e6d6-4890-b74a-540e333433a9-kube-api-access-l9bgx\") pod \"b5e3ba83-e6d6-4890-b74a-540e333433a9\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.460089 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-combined-ca-bundle\") pod \"b5e3ba83-e6d6-4890-b74a-540e333433a9\" (UID: \"b5e3ba83-e6d6-4890-b74a-540e333433a9\") " Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.461056 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5e3ba83-e6d6-4890-b74a-540e333433a9-logs" (OuterVolumeSpecName: "logs") pod "b5e3ba83-e6d6-4890-b74a-540e333433a9" (UID: "b5e3ba83-e6d6-4890-b74a-540e333433a9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.464521 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e3ba83-e6d6-4890-b74a-540e333433a9-kube-api-access-l9bgx" (OuterVolumeSpecName: "kube-api-access-l9bgx") pod "b5e3ba83-e6d6-4890-b74a-540e333433a9" (UID: "b5e3ba83-e6d6-4890-b74a-540e333433a9"). InnerVolumeSpecName "kube-api-access-l9bgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.488536 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b5e3ba83-e6d6-4890-b74a-540e333433a9" (UID: "b5e3ba83-e6d6-4890-b74a-540e333433a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.490249 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-config-data" (OuterVolumeSpecName: "config-data") pod "b5e3ba83-e6d6-4890-b74a-540e333433a9" (UID: "b5e3ba83-e6d6-4890-b74a-540e333433a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.509314 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b5e3ba83-e6d6-4890-b74a-540e333433a9" (UID: "b5e3ba83-e6d6-4890-b74a-540e333433a9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.509635 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b5e3ba83-e6d6-4890-b74a-540e333433a9" (UID: "b5e3ba83-e6d6-4890-b74a-540e333433a9"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.562183 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.562214 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9bgx\" (UniqueName: \"kubernetes.io/projected/b5e3ba83-e6d6-4890-b74a-540e333433a9-kube-api-access-l9bgx\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.562227 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.562235 4892 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.562245 4892 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5e3ba83-e6d6-4890-b74a-540e333433a9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.562255 4892 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5e3ba83-e6d6-4890-b74a-540e333433a9-logs\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.662280 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 06 12:29:22 crc kubenswrapper[4892]: W1006 12:29:22.670229 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1f22bc9_62b7_4c4e_b794_83a9f1a54f3c.slice/crio-b1896b5b57a4367f6d429521de197e9864bedb78a82406b70fe6bac94cc007b1 WatchSource:0}: Error finding container b1896b5b57a4367f6d429521de197e9864bedb78a82406b70fe6bac94cc007b1: Status 404 returned error can't find the container with id b1896b5b57a4367f6d429521de197e9864bedb78a82406b70fe6bac94cc007b1 Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.681405 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 06 12:29:22 crc kubenswrapper[4892]: W1006 12:29:22.691738 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5354ced_b54e_4a88_934b_7bebbacccf1e.slice/crio-4b6f24f47bb4a06bf52eee91c4e6481a17c0b66b0636d903dbcf16bcf2020443 WatchSource:0}: Error finding container 4b6f24f47bb4a06bf52eee91c4e6481a17c0b66b0636d903dbcf16bcf2020443: Status 404 returned error can't find the container with id 4b6f24f47bb4a06bf52eee91c4e6481a17c0b66b0636d903dbcf16bcf2020443 Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.732785 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c","Type":"ContainerStarted","Data":"b1896b5b57a4367f6d429521de197e9864bedb78a82406b70fe6bac94cc007b1"} Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.735406 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a5354ced-b54e-4a88-934b-7bebbacccf1e","Type":"ContainerStarted","Data":"4b6f24f47bb4a06bf52eee91c4e6481a17c0b66b0636d903dbcf16bcf2020443"} Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.739754 4892 generic.go:334] "Generic (PLEG): container finished" podID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerID="c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43" exitCode=0 Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.739873 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.739931 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5e3ba83-e6d6-4890-b74a-540e333433a9","Type":"ContainerDied","Data":"c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43"} Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.739991 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b5e3ba83-e6d6-4890-b74a-540e333433a9","Type":"ContainerDied","Data":"3951acae36bf8f7455fa6ae0fdd7294bfb1185da52247af09d4a586ff588ea7c"} Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.740025 4892 scope.go:117] "RemoveContainer" containerID="c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.772178 4892 scope.go:117] "RemoveContainer" containerID="2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.787973 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.818097 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.829689 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:22 crc kubenswrapper[4892]: E1006 12:29:22.830172 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-api" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.830195 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-api" Oct 06 12:29:22 crc kubenswrapper[4892]: E1006 12:29:22.830228 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-log" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.830237 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-log" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.830499 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-api" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.830522 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" containerName="nova-api-log" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.831870 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.834411 4892 scope.go:117] "RemoveContainer" containerID="c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.835882 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.836606 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.837816 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 06 12:29:22 crc kubenswrapper[4892]: E1006 12:29:22.840056 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43\": container with ID starting with c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43 not found: ID does not exist" containerID="c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.840172 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43"} err="failed to get container status \"c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43\": rpc error: code = NotFound desc = could not find container \"c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43\": container with ID starting with c62303eace48dfed9d5f9394d64e24048e84d3d5e2c18d32085b3c53a7d1ca43 not found: ID does not exist" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.840266 4892 scope.go:117] "RemoveContainer" containerID="2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.843769 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:22 crc kubenswrapper[4892]: E1006 12:29:22.843773 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae\": container with ID starting with 2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae not found: ID does not exist" containerID="2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.843841 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae"} err="failed to get container status \"2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae\": rpc error: code = NotFound desc = could not find container \"2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae\": container with ID starting with 2597604a02283ae3dd1ed54943614bec7e6046abe6f32bd0b8e4d3d4180118ae not found: ID does not exist" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.972903 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 
12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.973203 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-public-tls-certs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.973251 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87347f7-4f43-42da-9a0d-59fca9e193c5-logs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.973271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-config-data\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.973300 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2gpg\" (UniqueName: \"kubernetes.io/projected/b87347f7-4f43-42da-9a0d-59fca9e193c5-kube-api-access-f2gpg\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.973521 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.984228 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:29:22 crc kubenswrapper[4892]: I1006 12:29:22.984290 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.075085 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2gpg\" (UniqueName: \"kubernetes.io/projected/b87347f7-4f43-42da-9a0d-59fca9e193c5-kube-api-access-f2gpg\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.075171 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.075243 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.075295 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-public-tls-certs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.075340 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87347f7-4f43-42da-9a0d-59fca9e193c5-logs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.075358 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-config-data\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.076838 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b87347f7-4f43-42da-9a0d-59fca9e193c5-logs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.082973 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.083452 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-public-tls-certs\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.084184 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-config-data\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.086571 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87347f7-4f43-42da-9a0d-59fca9e193c5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.099190 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2gpg\" (UniqueName: \"kubernetes.io/projected/b87347f7-4f43-42da-9a0d-59fca9e193c5-kube-api-access-f2gpg\") pod \"nova-api-0\" (UID: \"b87347f7-4f43-42da-9a0d-59fca9e193c5\") " pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.160515 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.730096 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 06 12:29:23 crc kubenswrapper[4892]: W1006 12:29:23.735929 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb87347f7_4f43_42da_9a0d_59fca9e193c5.slice/crio-cdba4c59a23a0405601c751326b8f7c4c79a0e57b5a3a4143d128d3264d311d0 WatchSource:0}: Error finding container cdba4c59a23a0405601c751326b8f7c4c79a0e57b5a3a4143d128d3264d311d0: Status 404 returned error can't find the container with id cdba4c59a23a0405601c751326b8f7c4c79a0e57b5a3a4143d128d3264d311d0 Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.760464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5354ced-b54e-4a88-934b-7bebbacccf1e","Type":"ContainerStarted","Data":"fa323bb970a42e14e8faa72740501499d8f9a3b386a7ce9ff106d9e67018a1c2"} Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.760512 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5354ced-b54e-4a88-934b-7bebbacccf1e","Type":"ContainerStarted","Data":"480c2af88a2b6078cb0b43fd9205c08feb12926617509c9eeaaacfcc5b0969c3"} Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.775032 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c","Type":"ContainerStarted","Data":"d4bf3227a32a19ccb01bd4adb558c0a464cc69dd58b7b6491c00266768b12ad2"} Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.781401 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b87347f7-4f43-42da-9a0d-59fca9e193c5","Type":"ContainerStarted","Data":"cdba4c59a23a0405601c751326b8f7c4c79a0e57b5a3a4143d128d3264d311d0"} Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.786004 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.785984288 podStartE2EDuration="2.785984288s" podCreationTimestamp="2025-10-06 12:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:29:23.780181409 +0000 UTC m=+1250.329887174" watchObservedRunningTime="2025-10-06 12:29:23.785984288 +0000 UTC m=+1250.335690064" Oct 06 12:29:23 crc kubenswrapper[4892]: I1006 12:29:23.800858 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.8008419030000002 podStartE2EDuration="2.800841903s" podCreationTimestamp="2025-10-06 12:29:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:29:23.797291989 +0000 UTC m=+1250.346997754" watchObservedRunningTime="2025-10-06 12:29:23.800841903 +0000 UTC m=+1250.350547658" Oct 06 12:29:24 crc kubenswrapper[4892]: I1006 12:29:24.215620 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e3ba83-e6d6-4890-b74a-540e333433a9" path="/var/lib/kubelet/pods/b5e3ba83-e6d6-4890-b74a-540e333433a9/volumes" Oct 06 12:29:24 crc kubenswrapper[4892]: I1006 12:29:24.795493 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b87347f7-4f43-42da-9a0d-59fca9e193c5","Type":"ContainerStarted","Data":"6b7963280ee9351bf14e863a5a03b278a8981c666c96d245a33f9b4a984d6655"} Oct 06 12:29:24 crc kubenswrapper[4892]: I1006 12:29:24.795795 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b87347f7-4f43-42da-9a0d-59fca9e193c5","Type":"ContainerStarted","Data":"fc14e6d2c5965e30e31d4c322308ce1b41840db38c2ccc150082367988762ba7"} Oct 06 12:29:24 crc kubenswrapper[4892]: I1006 12:29:24.816850 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.816832906 podStartE2EDuration="2.816832906s" podCreationTimestamp="2025-10-06 12:29:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:29:24.814215499 +0000 UTC m=+1251.363921284" watchObservedRunningTime="2025-10-06 12:29:24.816832906 +0000 UTC m=+1251.366538671" Oct 06 12:29:27 crc kubenswrapper[4892]: I1006 12:29:27.142727 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 06 12:29:27 crc kubenswrapper[4892]: I1006 12:29:27.167868 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 12:29:27 crc kubenswrapper[4892]: I1006 12:29:27.167965 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 06 12:29:32 crc kubenswrapper[4892]: I1006 12:29:32.142977 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 06 12:29:32 crc kubenswrapper[4892]: I1006 12:29:32.167616 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 12:29:32 crc kubenswrapper[4892]: I1006 12:29:32.168890 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 06 12:29:32 crc kubenswrapper[4892]: I1006 12:29:32.199480 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 06 12:29:32 crc kubenswrapper[4892]: I1006 12:29:32.949656 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 06 12:29:33 crc kubenswrapper[4892]: I1006 12:29:33.161537 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:29:33 crc kubenswrapper[4892]: I1006 12:29:33.161614 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 06 12:29:33 crc kubenswrapper[4892]: I1006 12:29:33.185626 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a5354ced-b54e-4a88-934b-7bebbacccf1e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:29:33 crc kubenswrapper[4892]: I1006 12:29:33.185649 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a5354ced-b54e-4a88-934b-7bebbacccf1e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.229:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:29:34 crc kubenswrapper[4892]: I1006 12:29:34.210566 4892 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="b87347f7-4f43-42da-9a0d-59fca9e193c5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:29:34 crc kubenswrapper[4892]: I1006 12:29:34.211123 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b87347f7-4f43-42da-9a0d-59fca9e193c5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.230:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 06 12:29:40 crc kubenswrapper[4892]: I1006 12:29:40.924840 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 06 12:29:42 crc kubenswrapper[4892]: I1006 12:29:42.190365 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 12:29:42 crc kubenswrapper[4892]: I1006 12:29:42.190515 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 06 12:29:42 crc kubenswrapper[4892]: I1006 12:29:42.199192 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 12:29:42 crc kubenswrapper[4892]: I1006 12:29:42.200314 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 06 12:29:43 crc kubenswrapper[4892]: I1006 12:29:43.175129 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 12:29:43 crc kubenswrapper[4892]: I1006 12:29:43.175783 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 12:29:43 crc kubenswrapper[4892]: I1006 12:29:43.177761 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 06 12:29:43 crc kubenswrapper[4892]: I1006 12:29:43.187978 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 12:29:44 crc kubenswrapper[4892]: I1006 12:29:44.047477 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 06 12:29:44 crc kubenswrapper[4892]: I1006 12:29:44.060017 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 06 12:29:52 crc kubenswrapper[4892]: I1006 12:29:52.637408 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:29:52 crc kubenswrapper[4892]: I1006 12:29:52.984165 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:29:52 crc kubenswrapper[4892]: I1006 12:29:52.984268 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:29:52 crc kubenswrapper[4892]: I1006 12:29:52.984394 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:29:52 crc 
Oct 06 12:29:52 crc kubenswrapper[4892]: I1006 12:29:52.985773 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"860dd81af7b9279e259a2bd7600f304a9fac68884adcaaf5b381f360c68fdea5"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 12:29:52 crc kubenswrapper[4892]: I1006 12:29:52.985924 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://860dd81af7b9279e259a2bd7600f304a9fac68884adcaaf5b381f360c68fdea5" gracePeriod=600
Oct 06 12:29:53 crc kubenswrapper[4892]: I1006 12:29:53.156176 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="860dd81af7b9279e259a2bd7600f304a9fac68884adcaaf5b381f360c68fdea5" exitCode=0
Oct 06 12:29:53 crc kubenswrapper[4892]: I1006 12:29:53.156220 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"860dd81af7b9279e259a2bd7600f304a9fac68884adcaaf5b381f360c68fdea5"}
Oct 06 12:29:53 crc kubenswrapper[4892]: I1006 12:29:53.156253 4892 scope.go:117] "RemoveContainer" containerID="f99cab5f831d4479bae318ede8be6239cf73affb4f0ae80b3e22b31bc2f59223"
Oct 06 12:29:53 crc kubenswrapper[4892]: I1006 12:29:53.520857 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 12:29:54 crc kubenswrapper[4892]: I1006 12:29:54.166908 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"8e6bec4311317cf3d786aab7279e92bdb6ecd5789603229094ff6446f6367943"}
Oct 06 12:29:56 crc kubenswrapper[4892]: I1006 12:29:56.026370 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerName="rabbitmq" containerID="cri-o://675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263" gracePeriod=604797
Oct 06 12:29:56 crc kubenswrapper[4892]: I1006 12:29:56.696755 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerName="rabbitmq" containerID="cri-o://48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4" gracePeriod=604797
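Note the two different grace periods above: the liveness-failed machine-config-daemon is killed with gracePeriod=600, while the rabbitmq containers, which are being deleted, get gracePeriod=604797, which looks like a seven-day terminationGracePeriodSeconds (604800) minus the few seconds already elapsed since the DELETE. The underlying pattern is TERM first, KILL only when the grace period runs out; a schematic stand-in using a child process (not kubelet source, which delegates the actual kill to CRI-O):

package main

import (
    "fmt"
    "os/exec"
    "syscall"
    "time"
)

// killWithGrace sends SIGTERM, then SIGKILL only if the process outlives the
// grace period. The machine-config-daemon above exited promptly (exitCode=0),
// so its SIGKILL path was never taken.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
    cmd.Process.Signal(syscall.SIGTERM)
    done := make(chan error, 1)
    go func() { done <- cmd.Wait() }()
    select {
    case <-done:
        fmt.Println("exited within grace period")
    case <-time.After(grace):
        cmd.Process.Kill() // grace period expired
        <-done
        fmt.Println("force-killed")
    }
}

func main() {
    cmd := exec.Command("sleep", "300") // stand-in for a container process
    if err := cmd.Start(); err != nil {
        panic(err)
    }
    killWithGrace(cmd, 5*time.Second)
}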
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859273 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-pod-info\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859373 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-config-data\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859413 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-plugins\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859448 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-confd\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859510 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-tls\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859559 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-server-conf\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859632 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-erlang-cookie-secret\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859785 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859828 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-erlang-cookie\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859854 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zlt6\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-kube-api-access-9zlt6\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: 
\"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.859891 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-plugins-conf\") pod \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\" (UID: \"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c\") " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.860992 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.861282 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.861708 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.864976 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-pod-info" (OuterVolumeSpecName: "pod-info") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.865409 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-kube-api-access-9zlt6" (OuterVolumeSpecName: "kube-api-access-9zlt6") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "kube-api-access-9zlt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.865555 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.869931 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.870672 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.942829 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-server-conf" (OuterVolumeSpecName: "server-conf") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.961038 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-config-data" (OuterVolumeSpecName: "config-data") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.962432 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963646 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963687 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zlt6\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-kube-api-access-9zlt6\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963697 4892 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963707 4892 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963720 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963731 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963740 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 
12:29:57.963747 4892 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-server-conf\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.963756 4892 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:57 crc kubenswrapper[4892]: I1006 12:29:57.987996 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.065995 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.067027 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" (UID: "cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.173197 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.213867 4892 generic.go:334] "Generic (PLEG): container finished" podID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerID="675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263" exitCode=0 Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.213918 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c","Type":"ContainerDied","Data":"675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263"} Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.213944 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c","Type":"ContainerDied","Data":"9e6a7a5aea9cb8a9e54ec08bbcb82a74889370b873a5a0ee7c37083195a97d25"} Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.213962 4892 scope.go:117] "RemoveContainer" containerID="675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.214083 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.218529 4892 util.go:48] "No ready sandbox for pod can be found. 
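The teardown above follows the volume manager's fixed ordering: every volume gets a per-pod UnmountVolume.TearDown, then the one device-mountable volume here (the local-volume PV behind "persistence") gets a node-level UnmountDevice, and each "Volume detached" record appears only after the corresponding step has finished. A schematic of that ordering (illustrative types, not kubelet's reconciler):

package main

import "fmt"

type volume struct {
    name          string
    deviceMounted bool // true only for the local-volume PV here
}

// teardown mirrors the record sequence above: per-pod unmounts first, then a
// device-level unmount where needed, then the volume counts as detached.
func teardown(podUID string, vols []volume) {
    for _, v := range vols {
        fmt.Printf("TearDown %q for pod %s\n", v.name, podUID)
    }
    for _, v := range vols {
        if v.deviceMounted {
            fmt.Printf("UnmountDevice %q (node-level)\n", v.name)
        }
        fmt.Printf("detached %q\n", v.name)
    }
}

func main() {
    teardown("cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c", []volume{
        {name: "plugins-conf"},
        {name: "rabbitmq-confd"},
        {name: "local-storage06-crc", deviceMounted: true},
    })
}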
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.226431 4892 generic.go:334] "Generic (PLEG): container finished" podID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerID="48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4" exitCode=0 Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.226478 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bcc90cdb-7f84-4923-9eef-4fae34199b75","Type":"ContainerDied","Data":"48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4"} Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.226504 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bcc90cdb-7f84-4923-9eef-4fae34199b75","Type":"ContainerDied","Data":"43cf2e3715dba1961e5b0ab735063d4634df1096410b410d099389c3c38012a3"} Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.253421 4892 scope.go:117] "RemoveContainer" containerID="f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281224 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-plugins-conf\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281579 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-erlang-cookie\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281613 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-config-data\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281644 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bcc90cdb-7f84-4923-9eef-4fae34199b75-pod-info\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281726 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bcc90cdb-7f84-4923-9eef-4fae34199b75-erlang-cookie-secret\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281752 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281796 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx5pk\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-kube-api-access-gx5pk\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: 
\"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281853 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-tls\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281920 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-server-conf\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.281971 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-plugins\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.282004 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-confd\") pod \"bcc90cdb-7f84-4923-9eef-4fae34199b75\" (UID: \"bcc90cdb-7f84-4923-9eef-4fae34199b75\") " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.284277 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.284724 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.289939 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.289975 4892 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.291213 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.291587 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.292733 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc90cdb-7f84-4923-9eef-4fae34199b75-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.296049 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-kube-api-access-gx5pk" (OuterVolumeSpecName: "kube-api-access-gx5pk") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "kube-api-access-gx5pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.303579 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.327244 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bcc90cdb-7f84-4923-9eef-4fae34199b75-pod-info" (OuterVolumeSpecName: "pod-info") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.337452 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.351585 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-config-data" (OuterVolumeSpecName: "config-data") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.361505 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.372526 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.373268 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerName="setup-container" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.373290 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerName="setup-container" Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.373307 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerName="setup-container" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.373316 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerName="setup-container" Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.373373 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerName="rabbitmq" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.373382 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerName="rabbitmq" Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.373427 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerName="rabbitmq" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.373438 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerName="rabbitmq" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.373910 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" containerName="rabbitmq" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.373945 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" containerName="rabbitmq" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.377143 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.381263 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.381795 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.382062 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.382261 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mrf4b" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.382549 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.382723 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.382906 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.395801 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.395853 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.395867 4892 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bcc90cdb-7f84-4923-9eef-4fae34199b75-pod-info\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.395880 4892 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bcc90cdb-7f84-4923-9eef-4fae34199b75-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.395927 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.395939 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gx5pk\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-kube-api-access-gx5pk\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.395951 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.402751 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.411712 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-server-conf" (OuterVolumeSpecName: "server-conf") pod 
"bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.415270 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bcc90cdb-7f84-4923-9eef-4fae34199b75" (UID: "bcc90cdb-7f84-4923-9eef-4fae34199b75"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.436101 4892 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.474163 4892 scope.go:117] "RemoveContainer" containerID="675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263" Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.474588 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263\": container with ID starting with 675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263 not found: ID does not exist" containerID="675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.474621 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263"} err="failed to get container status \"675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263\": rpc error: code = NotFound desc = could not find container \"675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263\": container with ID starting with 675b2bee0167c50fb147fcd4338c6e357e7a452d09052735aa5c4e4de0145263 not found: ID does not exist" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.474641 4892 scope.go:117] "RemoveContainer" containerID="f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076" Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.475545 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076\": container with ID starting with f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076 not found: ID does not exist" containerID="f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.475588 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076"} err="failed to get container status \"f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076\": rpc error: code = NotFound desc = could not find container \"f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076\": container with ID starting with f2661f3c02960664bf5ab12d4f03f5bea1f12f036b6125c77d3fdaf11393a076 not found: ID does not exist" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.475620 4892 scope.go:117] "RemoveContainer" containerID="48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4" Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.497310 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-config-data\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.497390 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9qm\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-kube-api-access-ml9qm\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.497646 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.497792 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b11ee41a-0493-4955-b081-d78b83730ec4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.497821 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b11ee41a-0493-4955-b081-d78b83730ec4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.497863 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.497895 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.498992 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.499038 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.499092 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.499118 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.499279 4892 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bcc90cdb-7f84-4923-9eef-4fae34199b75-server-conf\") on node \"crc\" DevicePath \"\""
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.499299 4892 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bcc90cdb-7f84-4923-9eef-4fae34199b75-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.499311 4892 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.517251 4892 scope.go:117] "RemoveContainer" containerID="7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.564789 4892 scope.go:117] "RemoveContainer" containerID="48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4"
Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.567560 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4\": container with ID starting with 48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4 not found: ID does not exist" containerID="48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.567614 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4"} err="failed to get container status \"48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4\": rpc error: code = NotFound desc = could not find container \"48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4\": container with ID starting with 48a5463af32ef287cb9b6e288edbab4b1e83be1a5c8fba3ff21cb3e9d4a3b7f4 not found: ID does not exist"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.567644 4892 scope.go:117] "RemoveContainer" containerID="7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2"
Oct 06 12:29:58 crc kubenswrapper[4892]: E1006 12:29:58.567965 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2\": container with ID starting with 7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2 not found: ID does not exist" containerID="7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.568056 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2"} err="failed to get container status \"7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2\": rpc error: code = NotFound desc = could not find container \"7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2\": container with ID starting with 7a847b61333ca99395fe932fa2877c5ae9ed58b6857f008ebb5916eb6d5ea0f2 not found: ID does not exist"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.601478 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.601837 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.601951 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.602033 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.602153 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-config-data\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.602249 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9qm\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-kube-api-access-ml9qm\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.601699 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.602791 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.603423 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-config-data\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.603567 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.603848 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b11ee41a-0493-4955-b081-d78b83730ec4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.603893 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b11ee41a-0493-4955-b081-d78b83730ec4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.603924 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.603951 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.604454 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.606258 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.607301 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b11ee41a-0493-4955-b081-d78b83730ec4-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.615644 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.616887 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.617676 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b11ee41a-0493-4955-b081-d78b83730ec4-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.619083 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9qm\" (UniqueName: \"kubernetes.io/projected/b11ee41a-0493-4955-b081-d78b83730ec4-kube-api-access-ml9qm\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.627012 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b11ee41a-0493-4955-b081-d78b83730ec4-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.660991 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-server-0\" (UID: \"b11ee41a-0493-4955-b081-d78b83730ec4\") " pod="openstack/rabbitmq-server-0"
Oct 06 12:29:58 crc kubenswrapper[4892]: I1006 12:29:58.805379 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.236489 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.315294 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.442027 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.449884 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.473865 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
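Bringing the replacement rabbitmq-server-0 up walks the mirror image of the earlier teardown, and the records above show three phases per volume: VerifyControllerAttachedVolume (the volume is confirmed attached to node "crc"), MountVolume.MountDevice (once per device; note the local PV's device mount path "/mnt/openstack/pv06"), and MountVolume.SetUp (the per-pod mount under the pod's volumes directory). A compact illustration of that phase order (names illustrative, not kubelet's state machine):

package main

import "fmt"

// Phase order for one volume on pod (re)creation, matching the record
// sequence above for "local-storage06-crc".
var phases = []string{
    "VerifyControllerAttachedVolume", // volume attached to the node
    "MountVolume.MountDevice",        // node-level, e.g. /mnt/openstack/pv06
    "MountVolume.SetUp",              // per-pod mount into the pod's volumes dir
}

func main() {
    for _, phase := range phases {
        fmt.Printf("%s for volume %q pod %q\n", phase, "local-storage06-crc", "rabbitmq-server-0")
    }
}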
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.486939 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.487203 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.487463 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.487561 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.487909 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.488208 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-zllld" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.494615 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.513957 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.623663 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ea9e651-a19b-445b-96dc-fd25c0df95f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.623916 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624052 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624193 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624361 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624470 4892 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624568 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r58t\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-kube-api-access-7r58t\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624682 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ea9e651-a19b-445b-96dc-fd25c0df95f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624802 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.624921 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.625016 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.727090 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.727592 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.727729 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r58t\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-kube-api-access-7r58t\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.727907 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.728291 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.728580 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.728716 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.729272 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ea9e651-a19b-445b-96dc-fd25c0df95f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.729457 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.730006 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.730179 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.731890 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.732268 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0"
\"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.732487 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.732705 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ea9e651-a19b-445b-96dc-fd25c0df95f2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.734246 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.734529 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.735966 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ea9e651-a19b-445b-96dc-fd25c0df95f2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.735970 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.736065 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ea9e651-a19b-445b-96dc-fd25c0df95f2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.737155 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.754529 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r58t\" (UniqueName: \"kubernetes.io/projected/1ea9e651-a19b-445b-96dc-fd25c0df95f2-kube-api-access-7r58t\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ea9e651-a19b-445b-96dc-fd25c0df95f2\") " pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.790122 4892 
Oct 06 12:29:59 crc kubenswrapper[4892]: I1006 12:29:59.826600 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.154903 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"]
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.156703 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.160307 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.160399 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.187123 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc90cdb-7f84-4923-9eef-4fae34199b75" path="/var/lib/kubelet/pods/bcc90cdb-7f84-4923-9eef-4fae34199b75/volumes"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.188593 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c" path="/var/lib/kubelet/pods/cbae4d7a-3312-45b6-8af2-dd9e7ae8bf4c/volumes"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.189551 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"]
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.272660 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.272994 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b11ee41a-0493-4955-b081-d78b83730ec4","Type":"ContainerStarted","Data":"ed74187ea74d3acd6ff94451de8dc03a43b47378329d81a300046b5fe57b590c"}
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.344013 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbrs2\" (UniqueName: \"kubernetes.io/projected/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-kube-api-access-tbrs2\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.344066 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-config-volume\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.344172 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-secret-volume\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"
\"kubernetes.io/secret/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-secret-volume\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.446161 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbrs2\" (UniqueName: \"kubernetes.io/projected/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-kube-api-access-tbrs2\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.446211 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-config-volume\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.446271 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-secret-volume\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.447483 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-config-volume\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.453128 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-secret-volume\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.461868 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbrs2\" (UniqueName: \"kubernetes.io/projected/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-kube-api-access-tbrs2\") pod \"collect-profiles-29329230-2db6f\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.484814 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:30:00 crc kubenswrapper[4892]: I1006 12:30:00.919115 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"]
Oct 06 12:30:00 crc kubenswrapper[4892]: W1006 12:30:00.921745 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf18611_f7e0_4b32_9ef1_4695b6e28af3.slice/crio-e73b0f1c7bc9b5599a020db230ecf355125e2ded73a6ef9f40b076ef1fc594d8 WatchSource:0}: Error finding container e73b0f1c7bc9b5599a020db230ecf355125e2ded73a6ef9f40b076ef1fc594d8: Status 404 returned error can't find the container with id e73b0f1c7bc9b5599a020db230ecf355125e2ded73a6ef9f40b076ef1fc594d8
Oct 06 12:30:01 crc kubenswrapper[4892]: I1006 12:30:01.287609 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b11ee41a-0493-4955-b081-d78b83730ec4","Type":"ContainerStarted","Data":"761b69eba7deeb557eedb09958ca9ee3d7beb0205bc05bd12b37ed21dd223650"}
Oct 06 12:30:01 crc kubenswrapper[4892]: I1006 12:30:01.290087 4892 generic.go:334] "Generic (PLEG): container finished" podID="5bf18611-f7e0-4b32-9ef1-4695b6e28af3" containerID="9010a4dbf5b36a066fcac4dd65e36a415abaf6f044df5a5b865eb6b6b3ea307f" exitCode=0
Oct 06 12:30:01 crc kubenswrapper[4892]: I1006 12:30:01.290157 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" event={"ID":"5bf18611-f7e0-4b32-9ef1-4695b6e28af3","Type":"ContainerDied","Data":"9010a4dbf5b36a066fcac4dd65e36a415abaf6f044df5a5b865eb6b6b3ea307f"}
Oct 06 12:30:01 crc kubenswrapper[4892]: I1006 12:30:01.290183 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" event={"ID":"5bf18611-f7e0-4b32-9ef1-4695b6e28af3","Type":"ContainerStarted","Data":"e73b0f1c7bc9b5599a020db230ecf355125e2ded73a6ef9f40b076ef1fc594d8"}
Oct 06 12:30:01 crc kubenswrapper[4892]: I1006 12:30:01.292718 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ea9e651-a19b-445b-96dc-fd25c0df95f2","Type":"ContainerStarted","Data":"69936077f083fb5fbf710c7bee63061db684159db10a5a161b7c561e5fbad483"}
Oct 06 12:30:02 crc kubenswrapper[4892]: I1006 12:30:02.782511 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"
Oct 06 12:30:02 crc kubenswrapper[4892]: I1006 12:30:02.897285 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbrs2\" (UniqueName: \"kubernetes.io/projected/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-kube-api-access-tbrs2\") pod \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") "
Oct 06 12:30:02 crc kubenswrapper[4892]: I1006 12:30:02.897541 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-config-volume\") pod \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") "
Oct 06 12:30:02 crc kubenswrapper[4892]: I1006 12:30:02.897582 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-secret-volume\") pod \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\" (UID: \"5bf18611-f7e0-4b32-9ef1-4695b6e28af3\") "
Oct 06 12:30:02 crc kubenswrapper[4892]: I1006 12:30:02.898165 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-config-volume" (OuterVolumeSpecName: "config-volume") pod "5bf18611-f7e0-4b32-9ef1-4695b6e28af3" (UID: "5bf18611-f7e0-4b32-9ef1-4695b6e28af3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:30:02 crc kubenswrapper[4892]: I1006 12:30:02.903573 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-kube-api-access-tbrs2" (OuterVolumeSpecName: "kube-api-access-tbrs2") pod "5bf18611-f7e0-4b32-9ef1-4695b6e28af3" (UID: "5bf18611-f7e0-4b32-9ef1-4695b6e28af3"). InnerVolumeSpecName "kube-api-access-tbrs2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:30:02 crc kubenswrapper[4892]: I1006 12:30:02.903780 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5bf18611-f7e0-4b32-9ef1-4695b6e28af3" (UID: "5bf18611-f7e0-4b32-9ef1-4695b6e28af3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:30:03 crc kubenswrapper[4892]: I1006 12:30:02.999920 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:03 crc kubenswrapper[4892]: I1006 12:30:03.000255 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:03 crc kubenswrapper[4892]: I1006 12:30:03.000275 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbrs2\" (UniqueName: \"kubernetes.io/projected/5bf18611-f7e0-4b32-9ef1-4695b6e28af3-kube-api-access-tbrs2\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:03 crc kubenswrapper[4892]: I1006 12:30:03.320675 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" event={"ID":"5bf18611-f7e0-4b32-9ef1-4695b6e28af3","Type":"ContainerDied","Data":"e73b0f1c7bc9b5599a020db230ecf355125e2ded73a6ef9f40b076ef1fc594d8"} Oct 06 12:30:03 crc kubenswrapper[4892]: I1006 12:30:03.320751 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e73b0f1c7bc9b5599a020db230ecf355125e2ded73a6ef9f40b076ef1fc594d8" Oct 06 12:30:03 crc kubenswrapper[4892]: I1006 12:30:03.320695 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f" Oct 06 12:30:03 crc kubenswrapper[4892]: I1006 12:30:03.323207 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ea9e651-a19b-445b-96dc-fd25c0df95f2","Type":"ContainerStarted","Data":"b8e4e8e4687fd0186f78884c3367954c99dffb220fafd23f5549eb4492f4c701"} Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.507895 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"] Oct 06 12:30:08 crc kubenswrapper[4892]: E1006 12:30:08.509030 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf18611-f7e0-4b32-9ef1-4695b6e28af3" containerName="collect-profiles" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.509044 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf18611-f7e0-4b32-9ef1-4695b6e28af3" containerName="collect-profiles" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.509220 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf18611-f7e0-4b32-9ef1-4695b6e28af3" containerName="collect-profiles" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.512001 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.514898 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.540531 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"]
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.630588 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-nb\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.630643 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-openstack-edpm-ipam\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.630721 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-sb\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.630799 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf7jg\" (UniqueName: \"kubernetes.io/projected/8108fb91-c94e-4419-b6cd-ceb918071885-kube-api-access-rf7jg\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.630900 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-svc\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.630962 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-config\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.631003 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-swift-storage-0\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.732943 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-sb\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
(UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.733068 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf7jg\" (UniqueName: \"kubernetes.io/projected/8108fb91-c94e-4419-b6cd-ceb918071885-kube-api-access-rf7jg\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.733099 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-svc\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.733125 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-config\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.733143 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-swift-storage-0\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.733209 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-nb\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.733228 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-openstack-edpm-ipam\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.733845 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-sb\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.734053 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-openstack-edpm-ipam\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.734280 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-swift-storage-0\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " 
pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.734615 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-nb\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.734627 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-svc\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.734916 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-config\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.751466 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf7jg\" (UniqueName: \"kubernetes.io/projected/8108fb91-c94e-4419-b6cd-ceb918071885-kube-api-access-rf7jg\") pod \"dnsmasq-dns-56cb9bbdf5-wdkmt\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:08 crc kubenswrapper[4892]: I1006 12:30:08.837535 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:09 crc kubenswrapper[4892]: I1006 12:30:09.314742 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"] Oct 06 12:30:09 crc kubenswrapper[4892]: I1006 12:30:09.419117 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" event={"ID":"8108fb91-c94e-4419-b6cd-ceb918071885","Type":"ContainerStarted","Data":"b018a54e95186864f17c497942a6c222662e82e9a5d900a037f2f5bfa6832e94"} Oct 06 12:30:10 crc kubenswrapper[4892]: I1006 12:30:10.435990 4892 generic.go:334] "Generic (PLEG): container finished" podID="8108fb91-c94e-4419-b6cd-ceb918071885" containerID="fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8" exitCode=0 Oct 06 12:30:10 crc kubenswrapper[4892]: I1006 12:30:10.436117 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" event={"ID":"8108fb91-c94e-4419-b6cd-ceb918071885","Type":"ContainerDied","Data":"fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8"} Oct 06 12:30:11 crc kubenswrapper[4892]: I1006 12:30:11.454453 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" event={"ID":"8108fb91-c94e-4419-b6cd-ceb918071885","Type":"ContainerStarted","Data":"8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a"} Oct 06 12:30:11 crc kubenswrapper[4892]: I1006 12:30:11.455115 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:11 crc kubenswrapper[4892]: I1006 12:30:11.494004 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" podStartSLOduration=3.493968325 podStartE2EDuration="3.493968325s" 
Oct 06 12:30:18 crc kubenswrapper[4892]: I1006 12:30:18.839607 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"
Oct 06 12:30:18 crc kubenswrapper[4892]: I1006 12:30:18.931047 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5988d6d7-6bx28"]
Oct 06 12:30:18 crc kubenswrapper[4892]: I1006 12:30:18.931371 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" podUID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerName="dnsmasq-dns" containerID="cri-o://19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b" gracePeriod=10
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.202895 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f784f866c-b2xx9"]
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.204743 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f784f866c-b2xx9"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.218278 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f784f866c-b2xx9"]
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.293230 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-openstack-edpm-ipam\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.293658 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-dns-swift-storage-0\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.293713 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-ovsdbserver-nb\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.293760 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-config\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.293788 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbrxm\" (UniqueName: \"kubernetes.io/projected/aa0aa236-82a9-4c3c-9cdb-49515c29093d-kube-api-access-lbrxm\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9"
pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.293848 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-dns-svc\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.293943 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.395794 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbrxm\" (UniqueName: \"kubernetes.io/projected/aa0aa236-82a9-4c3c-9cdb-49515c29093d-kube-api-access-lbrxm\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.395905 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-dns-svc\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.395947 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.396042 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-openstack-edpm-ipam\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.396100 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-dns-swift-storage-0\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.396144 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-ovsdbserver-nb\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.396183 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-config\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " 
pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.397464 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-openstack-edpm-ipam\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.397491 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-dns-svc\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.397549 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-config\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.397873 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-dns-swift-storage-0\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.398078 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-ovsdbserver-sb\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.398498 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0aa236-82a9-4c3c-9cdb-49515c29093d-ovsdbserver-nb\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.419266 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbrxm\" (UniqueName: \"kubernetes.io/projected/aa0aa236-82a9-4c3c-9cdb-49515c29093d-kube-api-access-lbrxm\") pod \"dnsmasq-dns-7f784f866c-b2xx9\" (UID: \"aa0aa236-82a9-4c3c-9cdb-49515c29093d\") " pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.511027 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.532186 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.565288 4892 generic.go:334] "Generic (PLEG): container finished" podID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerID="19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b" exitCode=0
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.565363 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" event={"ID":"d9454f4b-def4-456d-8a43-1bb27e3d89bc","Type":"ContainerDied","Data":"19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b"}
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.565417 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28" event={"ID":"d9454f4b-def4-456d-8a43-1bb27e3d89bc","Type":"ContainerDied","Data":"1c9d86697adf389e53f4dcaa040e5f00f40b966540c7cab1d6832ce8aaee598e"}
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.565439 4892 scope.go:117] "RemoveContainer" containerID="19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.565630 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5988d6d7-6bx28"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.599089 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-swift-storage-0\") pod \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") "
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.599590 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-nb\") pod \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") "
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.599636 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-config\") pod \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") "
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.599675 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-sb\") pod \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") "
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.599779 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dnp4\" (UniqueName: \"kubernetes.io/projected/d9454f4b-def4-456d-8a43-1bb27e3d89bc-kube-api-access-9dnp4\") pod \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") "
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.599882 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-svc\") pod \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\" (UID: \"d9454f4b-def4-456d-8a43-1bb27e3d89bc\") "
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.607274 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9454f4b-def4-456d-8a43-1bb27e3d89bc-kube-api-access-9dnp4" (OuterVolumeSpecName: "kube-api-access-9dnp4") pod "d9454f4b-def4-456d-8a43-1bb27e3d89bc" (UID: "d9454f4b-def4-456d-8a43-1bb27e3d89bc"). InnerVolumeSpecName "kube-api-access-9dnp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.687199 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d9454f4b-def4-456d-8a43-1bb27e3d89bc" (UID: "d9454f4b-def4-456d-8a43-1bb27e3d89bc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.690641 4892 scope.go:117] "RemoveContainer" containerID="7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630"
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.690921 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d9454f4b-def4-456d-8a43-1bb27e3d89bc" (UID: "d9454f4b-def4-456d-8a43-1bb27e3d89bc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.695123 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-config" (OuterVolumeSpecName: "config") pod "d9454f4b-def4-456d-8a43-1bb27e3d89bc" (UID: "d9454f4b-def4-456d-8a43-1bb27e3d89bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.703482 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.703508 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.703518 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-config\") on node \"crc\" DevicePath \"\""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.703526 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dnp4\" (UniqueName: \"kubernetes.io/projected/d9454f4b-def4-456d-8a43-1bb27e3d89bc-kube-api-access-9dnp4\") on node \"crc\" DevicePath \"\""
Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.714725 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d9454f4b-def4-456d-8a43-1bb27e3d89bc" (UID: "d9454f4b-def4-456d-8a43-1bb27e3d89bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.724532 4892 scope.go:117] "RemoveContainer" containerID="19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b" Oct 06 12:30:19 crc kubenswrapper[4892]: E1006 12:30:19.725163 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b\": container with ID starting with 19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b not found: ID does not exist" containerID="19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.725311 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b"} err="failed to get container status \"19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b\": rpc error: code = NotFound desc = could not find container \"19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b\": container with ID starting with 19a3fad34b3d6187b4df7783f9f4f36bce550c66077b02701b58cd54924c545b not found: ID does not exist" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.725527 4892 scope.go:117] "RemoveContainer" containerID="7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630" Oct 06 12:30:19 crc kubenswrapper[4892]: E1006 12:30:19.726055 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630\": container with ID starting with 7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630 not found: ID does not exist" containerID="7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.726090 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630"} err="failed to get container status \"7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630\": rpc error: code = NotFound desc = could not find container \"7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630\": container with ID starting with 7fbdcef1b45b4180269a16cc6a4491aac831901f16c77047cf824611bab91630 not found: ID does not exist" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.728480 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d9454f4b-def4-456d-8a43-1bb27e3d89bc" (UID: "d9454f4b-def4-456d-8a43-1bb27e3d89bc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.805817 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.805853 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d9454f4b-def4-456d-8a43-1bb27e3d89bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.899044 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5988d6d7-6bx28"] Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.907739 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5988d6d7-6bx28"] Oct 06 12:30:19 crc kubenswrapper[4892]: I1006 12:30:19.976733 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f784f866c-b2xx9"] Oct 06 12:30:20 crc kubenswrapper[4892]: I1006 12:30:20.180842 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" path="/var/lib/kubelet/pods/d9454f4b-def4-456d-8a43-1bb27e3d89bc/volumes" Oct 06 12:30:20 crc kubenswrapper[4892]: I1006 12:30:20.578971 4892 generic.go:334] "Generic (PLEG): container finished" podID="aa0aa236-82a9-4c3c-9cdb-49515c29093d" containerID="80318309c30670704f06ba99d4de4e024402032955dfed30d02a419690c5005f" exitCode=0 Oct 06 12:30:20 crc kubenswrapper[4892]: I1006 12:30:20.579029 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" event={"ID":"aa0aa236-82a9-4c3c-9cdb-49515c29093d","Type":"ContainerDied","Data":"80318309c30670704f06ba99d4de4e024402032955dfed30d02a419690c5005f"} Oct 06 12:30:20 crc kubenswrapper[4892]: I1006 12:30:20.579062 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" event={"ID":"aa0aa236-82a9-4c3c-9cdb-49515c29093d","Type":"ContainerStarted","Data":"ef98daadba24c5fcb998e885a66e530092a8dfebfaa2cd84b0915fb3300c5367"} Oct 06 12:30:21 crc kubenswrapper[4892]: I1006 12:30:21.596959 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" event={"ID":"aa0aa236-82a9-4c3c-9cdb-49515c29093d","Type":"ContainerStarted","Data":"579cbe4af9a27cfda1f991fae8153852e213fda577a84aa30530705951eeb318"} Oct 06 12:30:21 crc kubenswrapper[4892]: I1006 12:30:21.597836 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:21 crc kubenswrapper[4892]: I1006 12:30:21.631719 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" podStartSLOduration=2.631690165 podStartE2EDuration="2.631690165s" podCreationTimestamp="2025-10-06 12:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:30:21.62571689 +0000 UTC m=+1308.175422655" watchObservedRunningTime="2025-10-06 12:30:21.631690165 +0000 UTC m=+1308.181395970" Oct 06 12:30:29 crc kubenswrapper[4892]: I1006 12:30:29.534437 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f784f866c-b2xx9" Oct 06 12:30:29 crc kubenswrapper[4892]: I1006 12:30:29.632704 4892 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"] Oct 06 12:30:29 crc kubenswrapper[4892]: I1006 12:30:29.661082 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" podUID="8108fb91-c94e-4419-b6cd-ceb918071885" containerName="dnsmasq-dns" containerID="cri-o://8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a" gracePeriod=10 Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.188791 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.361646 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-svc\") pod \"8108fb91-c94e-4419-b6cd-ceb918071885\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.361767 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-swift-storage-0\") pod \"8108fb91-c94e-4419-b6cd-ceb918071885\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.361836 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-config\") pod \"8108fb91-c94e-4419-b6cd-ceb918071885\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.361906 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-sb\") pod \"8108fb91-c94e-4419-b6cd-ceb918071885\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.361941 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-openstack-edpm-ipam\") pod \"8108fb91-c94e-4419-b6cd-ceb918071885\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.362135 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-nb\") pod \"8108fb91-c94e-4419-b6cd-ceb918071885\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.362167 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf7jg\" (UniqueName: \"kubernetes.io/projected/8108fb91-c94e-4419-b6cd-ceb918071885-kube-api-access-rf7jg\") pod \"8108fb91-c94e-4419-b6cd-ceb918071885\" (UID: \"8108fb91-c94e-4419-b6cd-ceb918071885\") " Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.378829 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8108fb91-c94e-4419-b6cd-ceb918071885-kube-api-access-rf7jg" (OuterVolumeSpecName: "kube-api-access-rf7jg") pod "8108fb91-c94e-4419-b6cd-ceb918071885" (UID: "8108fb91-c94e-4419-b6cd-ceb918071885"). InnerVolumeSpecName "kube-api-access-rf7jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.417348 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8108fb91-c94e-4419-b6cd-ceb918071885" (UID: "8108fb91-c94e-4419-b6cd-ceb918071885"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.425147 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8108fb91-c94e-4419-b6cd-ceb918071885" (UID: "8108fb91-c94e-4419-b6cd-ceb918071885"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.434154 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-config" (OuterVolumeSpecName: "config") pod "8108fb91-c94e-4419-b6cd-ceb918071885" (UID: "8108fb91-c94e-4419-b6cd-ceb918071885"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.442084 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "8108fb91-c94e-4419-b6cd-ceb918071885" (UID: "8108fb91-c94e-4419-b6cd-ceb918071885"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.443393 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8108fb91-c94e-4419-b6cd-ceb918071885" (UID: "8108fb91-c94e-4419-b6cd-ceb918071885"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.451180 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8108fb91-c94e-4419-b6cd-ceb918071885" (UID: "8108fb91-c94e-4419-b6cd-ceb918071885"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.464428 4892 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.464464 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.464475 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.464486 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.464495 4892 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.464503 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf7jg\" (UniqueName: \"kubernetes.io/projected/8108fb91-c94e-4419-b6cd-ceb918071885-kube-api-access-rf7jg\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.464513 4892 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8108fb91-c94e-4419-b6cd-ceb918071885-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.714624 4892 generic.go:334] "Generic (PLEG): container finished" podID="8108fb91-c94e-4419-b6cd-ceb918071885" containerID="8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a" exitCode=0 Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.714696 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.714700 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" event={"ID":"8108fb91-c94e-4419-b6cd-ceb918071885","Type":"ContainerDied","Data":"8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a"} Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.714872 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cb9bbdf5-wdkmt" event={"ID":"8108fb91-c94e-4419-b6cd-ceb918071885","Type":"ContainerDied","Data":"b018a54e95186864f17c497942a6c222662e82e9a5d900a037f2f5bfa6832e94"} Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.714906 4892 scope.go:117] "RemoveContainer" containerID="8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.762107 4892 scope.go:117] "RemoveContainer" containerID="fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.764998 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"] Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.776919 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56cb9bbdf5-wdkmt"] Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.799789 4892 scope.go:117] "RemoveContainer" containerID="8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a" Oct 06 12:30:30 crc kubenswrapper[4892]: E1006 12:30:30.800335 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a\": container with ID starting with 8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a not found: ID does not exist" containerID="8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.800380 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a"} err="failed to get container status \"8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a\": rpc error: code = NotFound desc = could not find container \"8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a\": container with ID starting with 8ba8d380d452c591a744b034563b60408236f184bdc33fe775879e51c40f4b9a not found: ID does not exist" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.800410 4892 scope.go:117] "RemoveContainer" containerID="fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8" Oct 06 12:30:30 crc kubenswrapper[4892]: E1006 12:30:30.802046 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8\": container with ID starting with fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8 not found: ID does not exist" containerID="fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8" Oct 06 12:30:30 crc kubenswrapper[4892]: I1006 12:30:30.802133 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8"} err="failed to get container status 
\"fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8\": rpc error: code = NotFound desc = could not find container \"fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8\": container with ID starting with fa076f22396cfd16cf59505e3917ce462c56adef21c33e5a2b3b324923b044e8 not found: ID does not exist" Oct 06 12:30:32 crc kubenswrapper[4892]: I1006 12:30:32.185445 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8108fb91-c94e-4419-b6cd-ceb918071885" path="/var/lib/kubelet/pods/8108fb91-c94e-4419-b6cd-ceb918071885/volumes" Oct 06 12:30:34 crc kubenswrapper[4892]: I1006 12:30:34.771095 4892 generic.go:334] "Generic (PLEG): container finished" podID="b11ee41a-0493-4955-b081-d78b83730ec4" containerID="761b69eba7deeb557eedb09958ca9ee3d7beb0205bc05bd12b37ed21dd223650" exitCode=0 Oct 06 12:30:34 crc kubenswrapper[4892]: I1006 12:30:34.771215 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b11ee41a-0493-4955-b081-d78b83730ec4","Type":"ContainerDied","Data":"761b69eba7deeb557eedb09958ca9ee3d7beb0205bc05bd12b37ed21dd223650"} Oct 06 12:30:35 crc kubenswrapper[4892]: I1006 12:30:35.798179 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b11ee41a-0493-4955-b081-d78b83730ec4","Type":"ContainerStarted","Data":"e7197ed12f3fa4c7e043fcef0be22c1609526e1203cb33559a2cc5f8cefc3f7b"} Oct 06 12:30:35 crc kubenswrapper[4892]: I1006 12:30:35.798868 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 06 12:30:35 crc kubenswrapper[4892]: I1006 12:30:35.800771 4892 generic.go:334] "Generic (PLEG): container finished" podID="1ea9e651-a19b-445b-96dc-fd25c0df95f2" containerID="b8e4e8e4687fd0186f78884c3367954c99dffb220fafd23f5549eb4492f4c701" exitCode=0 Oct 06 12:30:35 crc kubenswrapper[4892]: I1006 12:30:35.800856 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ea9e651-a19b-445b-96dc-fd25c0df95f2","Type":"ContainerDied","Data":"b8e4e8e4687fd0186f78884c3367954c99dffb220fafd23f5549eb4492f4c701"} Oct 06 12:30:35 crc kubenswrapper[4892]: I1006 12:30:35.826221 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.826197174 podStartE2EDuration="37.826197174s" podCreationTimestamp="2025-10-06 12:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:30:35.825096022 +0000 UTC m=+1322.374801817" watchObservedRunningTime="2025-10-06 12:30:35.826197174 +0000 UTC m=+1322.375902979" Oct 06 12:30:36 crc kubenswrapper[4892]: I1006 12:30:36.811722 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ea9e651-a19b-445b-96dc-fd25c0df95f2","Type":"ContainerStarted","Data":"b746636086784ba78291a98037b88348a089eefaae588bf3b220b0260622dc4c"} Oct 06 12:30:36 crc kubenswrapper[4892]: I1006 12:30:36.864034 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.864012555 podStartE2EDuration="37.864012555s" podCreationTimestamp="2025-10-06 12:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:30:36.855613369 +0000 UTC m=+1323.405319164" 
watchObservedRunningTime="2025-10-06 12:30:36.864012555 +0000 UTC m=+1323.413718330" Oct 06 12:30:39 crc kubenswrapper[4892]: I1006 12:30:39.827957 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.953695 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4"] Oct 06 12:30:47 crc kubenswrapper[4892]: E1006 12:30:47.955868 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerName="init" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.957426 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerName="init" Oct 06 12:30:47 crc kubenswrapper[4892]: E1006 12:30:47.957517 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8108fb91-c94e-4419-b6cd-ceb918071885" containerName="init" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.957578 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8108fb91-c94e-4419-b6cd-ceb918071885" containerName="init" Oct 06 12:30:47 crc kubenswrapper[4892]: E1006 12:30:47.957652 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8108fb91-c94e-4419-b6cd-ceb918071885" containerName="dnsmasq-dns" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.957712 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8108fb91-c94e-4419-b6cd-ceb918071885" containerName="dnsmasq-dns" Oct 06 12:30:47 crc kubenswrapper[4892]: E1006 12:30:47.957818 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerName="dnsmasq-dns" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.957883 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerName="dnsmasq-dns" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.958211 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9454f4b-def4-456d-8a43-1bb27e3d89bc" containerName="dnsmasq-dns" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.958289 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8108fb91-c94e-4419-b6cd-ceb918071885" containerName="dnsmasq-dns" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.959173 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.967544 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.967628 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.968139 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.968872 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:30:47 crc kubenswrapper[4892]: I1006 12:30:47.995044 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4"] Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.020929 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.020975 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.021360 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crsl4\" (UniqueName: \"kubernetes.io/projected/9b014a50-d437-4fd0-9d31-aff86fbf851c-kube-api-access-crsl4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.021403 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.123217 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crsl4\" (UniqueName: \"kubernetes.io/projected/9b014a50-d437-4fd0-9d31-aff86fbf851c-kube-api-access-crsl4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.123261 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.123367 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.123385 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.129144 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.141902 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.153595 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.154969 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crsl4\" (UniqueName: \"kubernetes.io/projected/9b014a50-d437-4fd0-9d31-aff86fbf851c-kube-api-access-crsl4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.302540 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:30:48 crc kubenswrapper[4892]: I1006 12:30:48.808555 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 06 12:30:49 crc kubenswrapper[4892]: I1006 12:30:49.033977 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4"] Oct 06 12:30:49 crc kubenswrapper[4892]: I1006 12:30:49.833591 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 06 12:30:49 crc kubenswrapper[4892]: I1006 12:30:49.963666 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" event={"ID":"9b014a50-d437-4fd0-9d31-aff86fbf851c","Type":"ContainerStarted","Data":"3ba15b02cf9ee977b30f8f6368036903ab03ae0e80d4df62903c31b912719fd2"} Oct 06 12:30:57 crc kubenswrapper[4892]: I1006 12:30:57.191809 4892 scope.go:117] "RemoveContainer" containerID="f6e35bb22f41354b2219d72a67a39722041f630b90a98e5e79da11808ed8e91a" Oct 06 12:31:00 crc kubenswrapper[4892]: I1006 12:31:00.082502 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" event={"ID":"9b014a50-d437-4fd0-9d31-aff86fbf851c","Type":"ContainerStarted","Data":"e7b186288b9bca0c3a1d4b67c7c24af131e1a38b71afffeeedfc6a69f9959a17"} Oct 06 12:31:00 crc kubenswrapper[4892]: I1006 12:31:00.122156 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" podStartSLOduration=2.772903577 podStartE2EDuration="13.122135921s" podCreationTimestamp="2025-10-06 12:30:47 +0000 UTC" firstStartedPulling="2025-10-06 12:30:49.062154421 +0000 UTC m=+1335.611860186" lastFinishedPulling="2025-10-06 12:30:59.411386765 +0000 UTC m=+1345.961092530" observedRunningTime="2025-10-06 12:31:00.10499015 +0000 UTC m=+1346.654695915" watchObservedRunningTime="2025-10-06 12:31:00.122135921 +0000 UTC m=+1346.671841696" Oct 06 12:31:11 crc kubenswrapper[4892]: I1006 12:31:11.236132 4892 generic.go:334] "Generic (PLEG): container finished" podID="9b014a50-d437-4fd0-9d31-aff86fbf851c" containerID="e7b186288b9bca0c3a1d4b67c7c24af131e1a38b71afffeeedfc6a69f9959a17" exitCode=0 Oct 06 12:31:11 crc kubenswrapper[4892]: I1006 12:31:11.237786 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" event={"ID":"9b014a50-d437-4fd0-9d31-aff86fbf851c","Type":"ContainerDied","Data":"e7b186288b9bca0c3a1d4b67c7c24af131e1a38b71afffeeedfc6a69f9959a17"} Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.853185 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.896711 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-repo-setup-combined-ca-bundle\") pod \"9b014a50-d437-4fd0-9d31-aff86fbf851c\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.896853 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crsl4\" (UniqueName: \"kubernetes.io/projected/9b014a50-d437-4fd0-9d31-aff86fbf851c-kube-api-access-crsl4\") pod \"9b014a50-d437-4fd0-9d31-aff86fbf851c\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.896937 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-inventory\") pod \"9b014a50-d437-4fd0-9d31-aff86fbf851c\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.897182 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-ssh-key\") pod \"9b014a50-d437-4fd0-9d31-aff86fbf851c\" (UID: \"9b014a50-d437-4fd0-9d31-aff86fbf851c\") " Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.903646 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b014a50-d437-4fd0-9d31-aff86fbf851c-kube-api-access-crsl4" (OuterVolumeSpecName: "kube-api-access-crsl4") pod "9b014a50-d437-4fd0-9d31-aff86fbf851c" (UID: "9b014a50-d437-4fd0-9d31-aff86fbf851c"). InnerVolumeSpecName "kube-api-access-crsl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.904419 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "9b014a50-d437-4fd0-9d31-aff86fbf851c" (UID: "9b014a50-d437-4fd0-9d31-aff86fbf851c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.925640 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b014a50-d437-4fd0-9d31-aff86fbf851c" (UID: "9b014a50-d437-4fd0-9d31-aff86fbf851c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:31:12 crc kubenswrapper[4892]: I1006 12:31:12.927072 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-inventory" (OuterVolumeSpecName: "inventory") pod "9b014a50-d437-4fd0-9d31-aff86fbf851c" (UID: "9b014a50-d437-4fd0-9d31-aff86fbf851c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.002432 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.002469 4892 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.002486 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crsl4\" (UniqueName: \"kubernetes.io/projected/9b014a50-d437-4fd0-9d31-aff86fbf851c-kube-api-access-crsl4\") on node \"crc\" DevicePath \"\"" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.002502 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b014a50-d437-4fd0-9d31-aff86fbf851c-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.264061 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" event={"ID":"9b014a50-d437-4fd0-9d31-aff86fbf851c","Type":"ContainerDied","Data":"3ba15b02cf9ee977b30f8f6368036903ab03ae0e80d4df62903c31b912719fd2"} Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.264136 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ba15b02cf9ee977b30f8f6368036903ab03ae0e80d4df62903c31b912719fd2" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.264176 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.386519 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl"] Oct 06 12:31:13 crc kubenswrapper[4892]: E1006 12:31:13.387073 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b014a50-d437-4fd0-9d31-aff86fbf851c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.387093 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b014a50-d437-4fd0-9d31-aff86fbf851c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.387443 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b014a50-d437-4fd0-9d31-aff86fbf851c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.388294 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.390855 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.391193 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.391460 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.394981 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.400110 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl"] Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.514476 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.514894 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.515064 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nzjc\" (UniqueName: \"kubernetes.io/projected/4f608c5b-99de-42b3-83c7-9a514aa5e54b-kube-api-access-9nzjc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.616829 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nzjc\" (UniqueName: \"kubernetes.io/projected/4f608c5b-99de-42b3-83c7-9a514aa5e54b-kube-api-access-9nzjc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.616908 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.617031 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.626808 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.628068 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.648015 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nzjc\" (UniqueName: \"kubernetes.io/projected/4f608c5b-99de-42b3-83c7-9a514aa5e54b-kube-api-access-9nzjc\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wrdgl\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:13 crc kubenswrapper[4892]: I1006 12:31:13.711036 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:14 crc kubenswrapper[4892]: I1006 12:31:14.301843 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl"] Oct 06 12:31:15 crc kubenswrapper[4892]: I1006 12:31:15.288129 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" event={"ID":"4f608c5b-99de-42b3-83c7-9a514aa5e54b","Type":"ContainerStarted","Data":"d83c7f6981e12e8de639a3949b8cc72383e409bde247a7991cbdeb75fc0909cb"} Oct 06 12:31:15 crc kubenswrapper[4892]: I1006 12:31:15.289523 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" event={"ID":"4f608c5b-99de-42b3-83c7-9a514aa5e54b","Type":"ContainerStarted","Data":"d40eb619c47ed756f0f848c29c5d24d428d2d343b7d59c8469fe8e93d32ddaff"} Oct 06 12:31:15 crc kubenswrapper[4892]: I1006 12:31:15.330910 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" podStartSLOduration=1.741622931 podStartE2EDuration="2.330875964s" podCreationTimestamp="2025-10-06 12:31:13 +0000 UTC" firstStartedPulling="2025-10-06 12:31:14.316182299 +0000 UTC m=+1360.865888054" lastFinishedPulling="2025-10-06 12:31:14.905435322 +0000 UTC m=+1361.455141087" observedRunningTime="2025-10-06 12:31:15.315561636 +0000 UTC m=+1361.865267441" watchObservedRunningTime="2025-10-06 12:31:15.330875964 +0000 UTC m=+1361.880581729" Oct 06 12:31:18 crc kubenswrapper[4892]: I1006 12:31:18.330367 4892 generic.go:334] "Generic (PLEG): container finished" podID="4f608c5b-99de-42b3-83c7-9a514aa5e54b" containerID="d83c7f6981e12e8de639a3949b8cc72383e409bde247a7991cbdeb75fc0909cb" exitCode=0 Oct 06 12:31:18 crc kubenswrapper[4892]: I1006 12:31:18.330449 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" 
event={"ID":"4f608c5b-99de-42b3-83c7-9a514aa5e54b","Type":"ContainerDied","Data":"d83c7f6981e12e8de639a3949b8cc72383e409bde247a7991cbdeb75fc0909cb"} Oct 06 12:31:19 crc kubenswrapper[4892]: I1006 12:31:19.909516 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:19 crc kubenswrapper[4892]: I1006 12:31:19.964373 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-ssh-key\") pod \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " Oct 06 12:31:19 crc kubenswrapper[4892]: I1006 12:31:19.964518 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-inventory\") pod \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " Oct 06 12:31:19 crc kubenswrapper[4892]: I1006 12:31:19.964550 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nzjc\" (UniqueName: \"kubernetes.io/projected/4f608c5b-99de-42b3-83c7-9a514aa5e54b-kube-api-access-9nzjc\") pod \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\" (UID: \"4f608c5b-99de-42b3-83c7-9a514aa5e54b\") " Oct 06 12:31:19 crc kubenswrapper[4892]: I1006 12:31:19.970144 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f608c5b-99de-42b3-83c7-9a514aa5e54b-kube-api-access-9nzjc" (OuterVolumeSpecName: "kube-api-access-9nzjc") pod "4f608c5b-99de-42b3-83c7-9a514aa5e54b" (UID: "4f608c5b-99de-42b3-83c7-9a514aa5e54b"). InnerVolumeSpecName "kube-api-access-9nzjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:31:19 crc kubenswrapper[4892]: I1006 12:31:19.994743 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-inventory" (OuterVolumeSpecName: "inventory") pod "4f608c5b-99de-42b3-83c7-9a514aa5e54b" (UID: "4f608c5b-99de-42b3-83c7-9a514aa5e54b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.009062 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f608c5b-99de-42b3-83c7-9a514aa5e54b" (UID: "4f608c5b-99de-42b3-83c7-9a514aa5e54b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.067549 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.067596 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f608c5b-99de-42b3-83c7-9a514aa5e54b-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.067619 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nzjc\" (UniqueName: \"kubernetes.io/projected/4f608c5b-99de-42b3-83c7-9a514aa5e54b-kube-api-access-9nzjc\") on node \"crc\" DevicePath \"\"" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.392400 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" event={"ID":"4f608c5b-99de-42b3-83c7-9a514aa5e54b","Type":"ContainerDied","Data":"d40eb619c47ed756f0f848c29c5d24d428d2d343b7d59c8469fe8e93d32ddaff"} Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.392465 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d40eb619c47ed756f0f848c29c5d24d428d2d343b7d59c8469fe8e93d32ddaff" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.392495 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wrdgl" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.475853 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv"] Oct 06 12:31:20 crc kubenswrapper[4892]: E1006 12:31:20.476671 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f608c5b-99de-42b3-83c7-9a514aa5e54b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.476763 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f608c5b-99de-42b3-83c7-9a514aa5e54b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.477096 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f608c5b-99de-42b3-83c7-9a514aa5e54b" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.478018 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.484227 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.484886 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.485262 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.489768 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.493658 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv"] Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.579165 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.579577 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2dgv\" (UniqueName: \"kubernetes.io/projected/de624448-d17e-48b7-a11b-bcbd70fa860f-kube-api-access-h2dgv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.579677 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.580036 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.683003 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.683155 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.683397 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.683476 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2dgv\" (UniqueName: \"kubernetes.io/projected/de624448-d17e-48b7-a11b-bcbd70fa860f-kube-api-access-h2dgv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.689039 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.689062 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.691653 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.711658 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2dgv\" (UniqueName: \"kubernetes.io/projected/de624448-d17e-48b7-a11b-bcbd70fa860f-kube-api-access-h2dgv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:31:20 crc kubenswrapper[4892]: I1006 12:31:20.800806 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv"
Oct 06 12:31:21 crc kubenswrapper[4892]: I1006 12:31:21.392720 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv"]
Oct 06 12:31:21 crc kubenswrapper[4892]: I1006 12:31:21.405963 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" event={"ID":"de624448-d17e-48b7-a11b-bcbd70fa860f","Type":"ContainerStarted","Data":"1347dda7a239c02a87e82421fc977bb16b57205e2b8fad3e05494ed53788e7ff"}
Oct 06 12:31:22 crc kubenswrapper[4892]: I1006 12:31:22.421944 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" event={"ID":"de624448-d17e-48b7-a11b-bcbd70fa860f","Type":"ContainerStarted","Data":"2ce270197dc64fda966b8d07709cb05bb8c37593f55236298f581cd679f95c57"}
Oct 06 12:31:22 crc kubenswrapper[4892]: I1006 12:31:22.448253 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" podStartSLOduration=1.931065276 podStartE2EDuration="2.448228226s" podCreationTimestamp="2025-10-06 12:31:20 +0000 UTC" firstStartedPulling="2025-10-06 12:31:21.385904978 +0000 UTC m=+1367.935610753" lastFinishedPulling="2025-10-06 12:31:21.903067898 +0000 UTC m=+1368.452773703" observedRunningTime="2025-10-06 12:31:22.446264909 +0000 UTC m=+1368.995970714" watchObservedRunningTime="2025-10-06 12:31:22.448228226 +0000 UTC m=+1368.997934031"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.592368 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7tk89"]
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.596405 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.624381 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tk89"]
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.660057 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-catalog-content\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.660358 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqrf\" (UniqueName: \"kubernetes.io/projected/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-kube-api-access-wrqrf\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.660681 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-utilities\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.762572 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-catalog-content\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.762823 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqrf\" (UniqueName: \"kubernetes.io/projected/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-kube-api-access-wrqrf\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.763068 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-utilities\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.763154 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-catalog-content\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.763662 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-utilities\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.795935 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqrf\" (UniqueName: \"kubernetes.io/projected/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-kube-api-access-wrqrf\") pod \"redhat-operators-7tk89\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") " pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:31 crc kubenswrapper[4892]: I1006 12:31:31.944316 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:32 crc kubenswrapper[4892]: I1006 12:31:32.424104 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7tk89"]
Oct 06 12:31:32 crc kubenswrapper[4892]: W1006 12:31:32.426038 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86eb8e3b_6149_4003_bef3_4f7b6b34e2b1.slice/crio-c9006c4ee70d0daa7cfbf5ed94cea6a8637d9a2dd514e4fa52f1ad9f5eb6276d WatchSource:0}: Error finding container c9006c4ee70d0daa7cfbf5ed94cea6a8637d9a2dd514e4fa52f1ad9f5eb6276d: Status 404 returned error can't find the container with id c9006c4ee70d0daa7cfbf5ed94cea6a8637d9a2dd514e4fa52f1ad9f5eb6276d
Oct 06 12:31:32 crc kubenswrapper[4892]: I1006 12:31:32.544555 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tk89" event={"ID":"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1","Type":"ContainerStarted","Data":"c9006c4ee70d0daa7cfbf5ed94cea6a8637d9a2dd514e4fa52f1ad9f5eb6276d"}
Oct 06 12:31:33 crc kubenswrapper[4892]: I1006 12:31:33.559614 4892 generic.go:334] "Generic (PLEG): container finished" podID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerID="84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5" exitCode=0
Oct 06 12:31:33 crc kubenswrapper[4892]: I1006 12:31:33.559673 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tk89" event={"ID":"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1","Type":"ContainerDied","Data":"84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5"}
Oct 06 12:31:35 crc kubenswrapper[4892]: I1006 12:31:35.588492 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tk89" event={"ID":"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1","Type":"ContainerStarted","Data":"89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780"}
Oct 06 12:31:36 crc kubenswrapper[4892]: I1006 12:31:36.607853 4892 generic.go:334] "Generic (PLEG): container finished" podID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerID="89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780" exitCode=0
Oct 06 12:31:36 crc kubenswrapper[4892]: I1006 12:31:36.607933 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tk89" event={"ID":"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1","Type":"ContainerDied","Data":"89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780"}
Oct 06 12:31:37 crc kubenswrapper[4892]: I1006 12:31:37.625871 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tk89" event={"ID":"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1","Type":"ContainerStarted","Data":"bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f"}
Oct 06 12:31:37 crc kubenswrapper[4892]: I1006 12:31:37.648719 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7tk89" podStartSLOduration=2.9960546040000002 podStartE2EDuration="6.648699932s" podCreationTimestamp="2025-10-06 12:31:31 +0000 UTC" firstStartedPulling="2025-10-06 12:31:33.563262527 +0000 UTC m=+1380.112968292" lastFinishedPulling="2025-10-06 12:31:37.215907815 +0000 UTC m=+1383.765613620" observedRunningTime="2025-10-06 12:31:37.643153262 +0000 UTC m=+1384.192859047" watchObservedRunningTime="2025-10-06 12:31:37.648699932 +0000 UTC m=+1384.198405707"
Oct 06 12:31:41 crc kubenswrapper[4892]: I1006 12:31:41.945084 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:41 crc kubenswrapper[4892]: I1006 12:31:41.945344 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:43 crc kubenswrapper[4892]: I1006 12:31:43.004383 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7tk89" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="registry-server" probeResult="failure" output=<
Oct 06 12:31:43 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s
Oct 06 12:31:43 crc kubenswrapper[4892]: >
Oct 06 12:31:52 crc kubenswrapper[4892]: I1006 12:31:52.015599 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:52 crc kubenswrapper[4892]: I1006 12:31:52.103019 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:52 crc kubenswrapper[4892]: I1006 12:31:52.268122 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tk89"]
Oct 06 12:31:53 crc kubenswrapper[4892]: I1006 12:31:53.802809 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7tk89" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="registry-server" containerID="cri-o://bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f" gracePeriod=2
Oct 06 12:31:54 crc kubenswrapper[4892]: E1006 12:31:54.045681 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86eb8e3b_6149_4003_bef3_4f7b6b34e2b1.slice/crio-bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f.scope\": RecentStats: unable to find data in memory cache]"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.276189 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.415831 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrqrf\" (UniqueName: \"kubernetes.io/projected/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-kube-api-access-wrqrf\") pod \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") "
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.415898 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-utilities\") pod \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") "
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.416024 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-catalog-content\") pod \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\" (UID: \"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1\") "
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.417864 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-utilities" (OuterVolumeSpecName: "utilities") pod "86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" (UID: "86eb8e3b-6149-4003-bef3-4f7b6b34e2b1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.423117 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-kube-api-access-wrqrf" (OuterVolumeSpecName: "kube-api-access-wrqrf") pod "86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" (UID: "86eb8e3b-6149-4003-bef3-4f7b6b34e2b1"). InnerVolumeSpecName "kube-api-access-wrqrf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.505108 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" (UID: "86eb8e3b-6149-4003-bef3-4f7b6b34e2b1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.518843 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrqrf\" (UniqueName: \"kubernetes.io/projected/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-kube-api-access-wrqrf\") on node \"crc\" DevicePath \"\""
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.518884 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.518900 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.819999 4892 generic.go:334] "Generic (PLEG): container finished" podID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerID="bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f" exitCode=0
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.820067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tk89" event={"ID":"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1","Type":"ContainerDied","Data":"bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f"}
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.821212 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7tk89" event={"ID":"86eb8e3b-6149-4003-bef3-4f7b6b34e2b1","Type":"ContainerDied","Data":"c9006c4ee70d0daa7cfbf5ed94cea6a8637d9a2dd514e4fa52f1ad9f5eb6276d"}
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.820088 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7tk89"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.821264 4892 scope.go:117] "RemoveContainer" containerID="bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.863652 4892 scope.go:117] "RemoveContainer" containerID="89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.873377 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7tk89"]
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.884167 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7tk89"]
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.906770 4892 scope.go:117] "RemoveContainer" containerID="84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.978202 4892 scope.go:117] "RemoveContainer" containerID="bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f"
Oct 06 12:31:54 crc kubenswrapper[4892]: E1006 12:31:54.978826 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f\": container with ID starting with bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f not found: ID does not exist" containerID="bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.978877 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f"} err="failed to get container status \"bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f\": rpc error: code = NotFound desc = could not find container \"bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f\": container with ID starting with bd79069ef8b7f0fe4908c380bc08f771399efea8513a260c870fbd090369c59f not found: ID does not exist"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.978910 4892 scope.go:117] "RemoveContainer" containerID="89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780"
Oct 06 12:31:54 crc kubenswrapper[4892]: E1006 12:31:54.979372 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780\": container with ID starting with 89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780 not found: ID does not exist" containerID="89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.979481 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780"} err="failed to get container status \"89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780\": rpc error: code = NotFound desc = could not find container \"89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780\": container with ID starting with 89be45b71c7d6a2d033dcc342f74c592aa1fe75ee5dd77d0e86fe9a37c4eb780 not found: ID does not exist"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.979588 4892 scope.go:117] "RemoveContainer" containerID="84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5"
Oct 06 12:31:54 crc kubenswrapper[4892]: E1006 12:31:54.980221 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5\": container with ID starting with 84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5 not found: ID does not exist" containerID="84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5"
Oct 06 12:31:54 crc kubenswrapper[4892]: I1006 12:31:54.980335 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5"} err="failed to get container status \"84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5\": rpc error: code = NotFound desc = could not find container \"84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5\": container with ID starting with 84be594e49e1b4b04df20befc6b5ae7cc1eddf971d6d32c77d39981d543b3af5 not found: ID does not exist"
Oct 06 12:31:56 crc kubenswrapper[4892]: I1006 12:31:56.189489 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" path="/var/lib/kubelet/pods/86eb8e3b-6149-4003-bef3-4f7b6b34e2b1/volumes"
Oct 06 12:31:59 crc kubenswrapper[4892]: I1006 12:31:59.425390 4892 scope.go:117] "RemoveContainer" containerID="dd0c8389a544aca5952a2124e2ec0fbe7acc84d0f49e1af11649de19467053dc"
Oct 06 12:31:59 crc kubenswrapper[4892]: I1006 12:31:59.476052 4892 scope.go:117] "RemoveContainer" containerID="ba8d2ebf65b90cf60310ced4e4fa82489e0b79f9e8a427798a4590c8aa265643"
Oct 06 12:31:59 crc kubenswrapper[4892]: I1006 12:31:59.515816 4892 scope.go:117] "RemoveContainer" containerID="165b75baeec76a8834f499781ad98fd3c24d3fd287af404da8cbe5e88305d1db"
Oct 06 12:31:59 crc kubenswrapper[4892]: I1006 12:31:59.590710 4892 scope.go:117] "RemoveContainer" containerID="f692fc4783157738a25bd255555118a834fe52a60c84f1db96156f50891b5ea4"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.017599 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6kb7r"]
Oct 06 12:32:21 crc kubenswrapper[4892]: E1006 12:32:21.019228 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="registry-server"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.019253 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="registry-server"
Oct 06 12:32:21 crc kubenswrapper[4892]: E1006 12:32:21.019287 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="extract-content"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.019300 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="extract-content"
Oct 06 12:32:21 crc kubenswrapper[4892]: E1006 12:32:21.019401 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="extract-utilities"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.019415 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="extract-utilities"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.020012 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="86eb8e3b-6149-4003-bef3-4f7b6b34e2b1" containerName="registry-server"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.023729 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.036438 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kb7r"]
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.109265 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-utilities\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.109493 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-catalog-content\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.109831 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmwdz\" (UniqueName: \"kubernetes.io/projected/02aa27bd-a533-4b5c-877e-381c28951c46-kube-api-access-xmwdz\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.212016 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-utilities\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.212178 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-catalog-content\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.212312 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmwdz\" (UniqueName: \"kubernetes.io/projected/02aa27bd-a533-4b5c-877e-381c28951c46-kube-api-access-xmwdz\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.212843 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-utilities\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.212848 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-catalog-content\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.233408 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmwdz\" (UniqueName: \"kubernetes.io/projected/02aa27bd-a533-4b5c-877e-381c28951c46-kube-api-access-xmwdz\") pod \"redhat-marketplace-6kb7r\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") " pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.363748 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:21 crc kubenswrapper[4892]: I1006 12:32:21.839427 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kb7r"]
Oct 06 12:32:22 crc kubenswrapper[4892]: I1006 12:32:22.207184 4892 generic.go:334] "Generic (PLEG): container finished" podID="02aa27bd-a533-4b5c-877e-381c28951c46" containerID="66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347" exitCode=0
Oct 06 12:32:22 crc kubenswrapper[4892]: I1006 12:32:22.207239 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kb7r" event={"ID":"02aa27bd-a533-4b5c-877e-381c28951c46","Type":"ContainerDied","Data":"66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347"}
Oct 06 12:32:22 crc kubenswrapper[4892]: I1006 12:32:22.207275 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kb7r" event={"ID":"02aa27bd-a533-4b5c-877e-381c28951c46","Type":"ContainerStarted","Data":"07c8e040eb1f1e14c8672c00bf87518ea2ed926f550d97c07d245b46e023a28f"}
Oct 06 12:32:22 crc kubenswrapper[4892]: I1006 12:32:22.985484 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 12:32:22 crc kubenswrapper[4892]: I1006 12:32:22.985787 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 12:32:23 crc kubenswrapper[4892]: I1006 12:32:23.217627 4892 generic.go:334] "Generic (PLEG): container finished" podID="02aa27bd-a533-4b5c-877e-381c28951c46" containerID="362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834" exitCode=0
Oct 06 12:32:23 crc kubenswrapper[4892]: I1006 12:32:23.217732 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kb7r" event={"ID":"02aa27bd-a533-4b5c-877e-381c28951c46","Type":"ContainerDied","Data":"362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834"}
Oct 06 12:32:24 crc kubenswrapper[4892]: I1006 12:32:24.228165 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kb7r" event={"ID":"02aa27bd-a533-4b5c-877e-381c28951c46","Type":"ContainerStarted","Data":"4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2"}
Oct 06 12:32:24 crc kubenswrapper[4892]: I1006 12:32:24.249018 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6kb7r" podStartSLOduration=2.742099694 podStartE2EDuration="4.248993989s" podCreationTimestamp="2025-10-06 12:32:20 +0000 UTC" firstStartedPulling="2025-10-06 12:32:22.208656146 +0000 UTC m=+1428.758361911" lastFinishedPulling="2025-10-06 12:32:23.715550431 +0000 UTC m=+1430.265256206" observedRunningTime="2025-10-06 12:32:24.243357857 +0000 UTC m=+1430.793063632" watchObservedRunningTime="2025-10-06 12:32:24.248993989 +0000 UTC m=+1430.798699754"
Oct 06 12:32:31 crc kubenswrapper[4892]: I1006 12:32:31.364851 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:31 crc kubenswrapper[4892]: I1006 12:32:31.365477 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:31 crc kubenswrapper[4892]: I1006 12:32:31.438671 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:32 crc kubenswrapper[4892]: I1006 12:32:32.393900 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:32 crc kubenswrapper[4892]: I1006 12:32:32.458229 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kb7r"]
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.343221 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6kb7r" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="registry-server" containerID="cri-o://4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2" gracePeriod=2
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.790065 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.912229 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-utilities\") pod \"02aa27bd-a533-4b5c-877e-381c28951c46\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") "
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.912287 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-catalog-content\") pod \"02aa27bd-a533-4b5c-877e-381c28951c46\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") "
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.912540 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmwdz\" (UniqueName: \"kubernetes.io/projected/02aa27bd-a533-4b5c-877e-381c28951c46-kube-api-access-xmwdz\") pod \"02aa27bd-a533-4b5c-877e-381c28951c46\" (UID: \"02aa27bd-a533-4b5c-877e-381c28951c46\") "
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.914013 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-utilities" (OuterVolumeSpecName: "utilities") pod "02aa27bd-a533-4b5c-877e-381c28951c46" (UID: "02aa27bd-a533-4b5c-877e-381c28951c46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.920101 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02aa27bd-a533-4b5c-877e-381c28951c46-kube-api-access-xmwdz" (OuterVolumeSpecName: "kube-api-access-xmwdz") pod "02aa27bd-a533-4b5c-877e-381c28951c46" (UID: "02aa27bd-a533-4b5c-877e-381c28951c46"). InnerVolumeSpecName "kube-api-access-xmwdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:32:34 crc kubenswrapper[4892]: I1006 12:32:34.926128 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02aa27bd-a533-4b5c-877e-381c28951c46" (UID: "02aa27bd-a533-4b5c-877e-381c28951c46"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.014194 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.014221 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02aa27bd-a533-4b5c-877e-381c28951c46-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.014233 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmwdz\" (UniqueName: \"kubernetes.io/projected/02aa27bd-a533-4b5c-877e-381c28951c46-kube-api-access-xmwdz\") on node \"crc\" DevicePath \"\""
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.355248 4892 generic.go:334] "Generic (PLEG): container finished" podID="02aa27bd-a533-4b5c-877e-381c28951c46" containerID="4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2" exitCode=0
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.355310 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kb7r" event={"ID":"02aa27bd-a533-4b5c-877e-381c28951c46","Type":"ContainerDied","Data":"4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2"}
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.355396 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6kb7r" event={"ID":"02aa27bd-a533-4b5c-877e-381c28951c46","Type":"ContainerDied","Data":"07c8e040eb1f1e14c8672c00bf87518ea2ed926f550d97c07d245b46e023a28f"}
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.355421 4892 scope.go:117] "RemoveContainer" containerID="4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.356474 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6kb7r"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.388863 4892 scope.go:117] "RemoveContainer" containerID="362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.399962 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kb7r"]
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.410614 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6kb7r"]
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.414434 4892 scope.go:117] "RemoveContainer" containerID="66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.475381 4892 scope.go:117] "RemoveContainer" containerID="4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2"
Oct 06 12:32:35 crc kubenswrapper[4892]: E1006 12:32:35.476083 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2\": container with ID starting with 4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2 not found: ID does not exist" containerID="4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.476117 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2"} err="failed to get container status \"4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2\": rpc error: code = NotFound desc = could not find container \"4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2\": container with ID starting with 4a7f807ff67db39a5a182bf0f9cc14199da8c85afa8e876a6a1193a8455a31e2 not found: ID does not exist"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.476139 4892 scope.go:117] "RemoveContainer" containerID="362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834"
Oct 06 12:32:35 crc kubenswrapper[4892]: E1006 12:32:35.476796 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834\": container with ID starting with 362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834 not found: ID does not exist" containerID="362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.476823 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834"} err="failed to get container status \"362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834\": rpc error: code = NotFound desc = could not find container \"362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834\": container with ID starting with 362a3687baa713382023854e947a9eeffe303bd3c3d975cd4329feb74feb3834 not found: ID does not exist"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.476847 4892 scope.go:117] "RemoveContainer" containerID="66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347"
Oct 06 12:32:35 crc kubenswrapper[4892]: E1006 12:32:35.477204 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347\": container with ID starting with 66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347 not found: ID does not exist" containerID="66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347"
Oct 06 12:32:35 crc kubenswrapper[4892]: I1006 12:32:35.477254 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347"} err="failed to get container status \"66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347\": rpc error: code = NotFound desc = could not find container \"66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347\": container with ID starting with 66c557d32bc8703b11a5cccf8c2354dc50fb7401111bac019f3b31e99b47e347 not found: ID does not exist"
Oct 06 12:32:36 crc kubenswrapper[4892]: I1006 12:32:36.183243 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" path="/var/lib/kubelet/pods/02aa27bd-a533-4b5c-877e-381c28951c46/volumes"
Oct 06 12:32:52 crc kubenswrapper[4892]: I1006 12:32:52.984406 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 12:32:52 crc kubenswrapper[4892]: I1006 12:32:52.985316 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 12:32:59 crc kubenswrapper[4892]: I1006 12:32:59.742231 4892 scope.go:117] "RemoveContainer" containerID="574db5012a0285d7a89376db584b59015d118e027c004d85d5d7c924592b50e0"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.212068 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9qc26"]
Oct 06 12:33:15 crc kubenswrapper[4892]: E1006 12:33:15.213148 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="extract-utilities"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.213166 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="extract-utilities"
Oct 06 12:33:15 crc kubenswrapper[4892]: E1006 12:33:15.213184 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="registry-server"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.213192 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="registry-server"
Oct 06 12:33:15 crc kubenswrapper[4892]: E1006 12:33:15.213209 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="extract-content"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.213217 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="extract-content"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.213507 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="02aa27bd-a533-4b5c-877e-381c28951c46" containerName="registry-server"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.215365 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.231288 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qc26"]
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.324285 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-utilities\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.324591 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6wqq\" (UniqueName: \"kubernetes.io/projected/16c9b412-439b-42df-975c-1bd741466d01-kube-api-access-d6wqq\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.324634 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-catalog-content\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.426629 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-utilities\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.426914 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6wqq\" (UniqueName: \"kubernetes.io/projected/16c9b412-439b-42df-975c-1bd741466d01-kube-api-access-d6wqq\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.426959 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-catalog-content\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.427211 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-utilities\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.427715 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-catalog-content\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.454963 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6wqq\" (UniqueName: \"kubernetes.io/projected/16c9b412-439b-42df-975c-1bd741466d01-kube-api-access-d6wqq\") pod \"certified-operators-9qc26\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") " pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:15 crc kubenswrapper[4892]: I1006 12:33:15.548618 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:16 crc kubenswrapper[4892]: I1006 12:33:16.044087 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9qc26"]
Oct 06 12:33:16 crc kubenswrapper[4892]: W1006 12:33:16.052930 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c9b412_439b_42df_975c_1bd741466d01.slice/crio-fe0618bb3f7c20e9c43435462385ce8a514fcf090ea009fc75f84cb774386668 WatchSource:0}: Error finding container fe0618bb3f7c20e9c43435462385ce8a514fcf090ea009fc75f84cb774386668: Status 404 returned error can't find the container with id fe0618bb3f7c20e9c43435462385ce8a514fcf090ea009fc75f84cb774386668
Oct 06 12:33:16 crc kubenswrapper[4892]: I1006 12:33:16.901029 4892 generic.go:334] "Generic (PLEG): container finished" podID="16c9b412-439b-42df-975c-1bd741466d01" containerID="8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2" exitCode=0
Oct 06 12:33:16 crc kubenswrapper[4892]: I1006 12:33:16.901137 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qc26" event={"ID":"16c9b412-439b-42df-975c-1bd741466d01","Type":"ContainerDied","Data":"8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2"}
Oct 06 12:33:16 crc kubenswrapper[4892]: I1006 12:33:16.901545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qc26" event={"ID":"16c9b412-439b-42df-975c-1bd741466d01","Type":"ContainerStarted","Data":"fe0618bb3f7c20e9c43435462385ce8a514fcf090ea009fc75f84cb774386668"}
Oct 06 12:33:18 crc kubenswrapper[4892]: I1006 12:33:18.927272 4892 generic.go:334] "Generic (PLEG): container finished" podID="16c9b412-439b-42df-975c-1bd741466d01" containerID="6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc" exitCode=0
Oct 06 12:33:18 crc kubenswrapper[4892]: I1006 12:33:18.927360 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qc26" event={"ID":"16c9b412-439b-42df-975c-1bd741466d01","Type":"ContainerDied","Data":"6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc"}
Oct 06 12:33:19 crc kubenswrapper[4892]: I1006 12:33:19.940451 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qc26" event={"ID":"16c9b412-439b-42df-975c-1bd741466d01","Type":"ContainerStarted","Data":"1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8"}
Oct 06 12:33:19 crc kubenswrapper[4892]: I1006 12:33:19.965247 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9qc26" podStartSLOduration=2.431985929 podStartE2EDuration="4.965223874s" podCreationTimestamp="2025-10-06 12:33:15 +0000 UTC" firstStartedPulling="2025-10-06 12:33:16.903784181 +0000 UTC m=+1483.453489956" lastFinishedPulling="2025-10-06 12:33:19.437022136 +0000 UTC m=+1485.986727901" observedRunningTime="2025-10-06 12:33:19.96369986 +0000 UTC m=+1486.513405645" watchObservedRunningTime="2025-10-06 12:33:19.965223874 +0000 UTC m=+1486.514929649"
Oct 06 12:33:22 crc kubenswrapper[4892]: I1006 12:33:22.984345 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 12:33:22 crc kubenswrapper[4892]: I1006 12:33:22.984937 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 12:33:22 crc kubenswrapper[4892]: I1006 12:33:22.984996 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s"
Oct 06 12:33:22 crc kubenswrapper[4892]: I1006 12:33:22.985999 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e6bec4311317cf3d786aab7279e92bdb6ecd5789603229094ff6446f6367943"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 12:33:22 crc kubenswrapper[4892]: I1006 12:33:22.986083 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://8e6bec4311317cf3d786aab7279e92bdb6ecd5789603229094ff6446f6367943" gracePeriod=600
Oct 06 12:33:23 crc kubenswrapper[4892]: I1006 12:33:23.989389 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="8e6bec4311317cf3d786aab7279e92bdb6ecd5789603229094ff6446f6367943" exitCode=0
Oct 06 12:33:23 crc kubenswrapper[4892]: I1006 12:33:23.989449 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"8e6bec4311317cf3d786aab7279e92bdb6ecd5789603229094ff6446f6367943"}
Oct 06 12:33:23 crc kubenswrapper[4892]: I1006 12:33:23.989730 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262"}
Oct 06 12:33:23 crc kubenswrapper[4892]: I1006 12:33:23.989760 4892 scope.go:117] "RemoveContainer" containerID="860dd81af7b9279e259a2bd7600f304a9fac68884adcaaf5b381f360c68fdea5"
Oct 06 12:33:25 crc kubenswrapper[4892]: I1006 12:33:25.549680 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:25 crc kubenswrapper[4892]: I1006 12:33:25.551675 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:25 crc kubenswrapper[4892]: I1006 12:33:25.643594 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:26 crc kubenswrapper[4892]: I1006 12:33:26.096645 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:26 crc kubenswrapper[4892]: I1006 12:33:26.158779 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qc26"]
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.052529 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9qc26" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="registry-server" containerID="cri-o://1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8" gracePeriod=2
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.575692 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.648555 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6wqq\" (UniqueName: \"kubernetes.io/projected/16c9b412-439b-42df-975c-1bd741466d01-kube-api-access-d6wqq\") pod \"16c9b412-439b-42df-975c-1bd741466d01\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") "
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.648962 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-catalog-content\") pod \"16c9b412-439b-42df-975c-1bd741466d01\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") "
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.648989 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-utilities\") pod \"16c9b412-439b-42df-975c-1bd741466d01\" (UID: \"16c9b412-439b-42df-975c-1bd741466d01\") "
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.649868 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-utilities" (OuterVolumeSpecName: "utilities") pod "16c9b412-439b-42df-975c-1bd741466d01" (UID: "16c9b412-439b-42df-975c-1bd741466d01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.656852 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c9b412-439b-42df-975c-1bd741466d01-kube-api-access-d6wqq" (OuterVolumeSpecName: "kube-api-access-d6wqq") pod "16c9b412-439b-42df-975c-1bd741466d01" (UID: "16c9b412-439b-42df-975c-1bd741466d01"). InnerVolumeSpecName "kube-api-access-d6wqq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.689358 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16c9b412-439b-42df-975c-1bd741466d01" (UID: "16c9b412-439b-42df-975c-1bd741466d01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.750981 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6wqq\" (UniqueName: \"kubernetes.io/projected/16c9b412-439b-42df-975c-1bd741466d01-kube-api-access-d6wqq\") on node \"crc\" DevicePath \"\""
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.751013 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 12:33:28 crc kubenswrapper[4892]: I1006 12:33:28.751024 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16c9b412-439b-42df-975c-1bd741466d01-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.067795 4892 generic.go:334] "Generic (PLEG): container finished" podID="16c9b412-439b-42df-975c-1bd741466d01" containerID="1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8" exitCode=0
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.067864 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qc26" event={"ID":"16c9b412-439b-42df-975c-1bd741466d01","Type":"ContainerDied","Data":"1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8"}
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.067953 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9qc26" event={"ID":"16c9b412-439b-42df-975c-1bd741466d01","Type":"ContainerDied","Data":"fe0618bb3f7c20e9c43435462385ce8a514fcf090ea009fc75f84cb774386668"}
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.067993 4892 scope.go:117] "RemoveContainer" containerID="1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.069583 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9qc26"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.115497 4892 scope.go:117] "RemoveContainer" containerID="6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.132235 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9qc26"]
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.143047 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9qc26"]
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.162603 4892 scope.go:117] "RemoveContainer" containerID="8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.208091 4892 scope.go:117] "RemoveContainer" containerID="1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8"
Oct 06 12:33:29 crc kubenswrapper[4892]: E1006 12:33:29.208681 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8\": container with ID starting with 1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8 not found: ID does not exist" containerID="1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.208735 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8"} err="failed to get container status \"1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8\": rpc error: code = NotFound desc = could not find container \"1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8\": container with ID starting with 1124d3e715f6c3f5a830dc9489c7c185e0e6159484393f8a7655c621d92ebee8 not found: ID does not exist"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.208776 4892 scope.go:117] "RemoveContainer" containerID="6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc"
Oct 06 12:33:29 crc kubenswrapper[4892]: E1006 12:33:29.209277 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc\": container with ID starting with 6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc not found: ID does not exist" containerID="6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.209425 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc"} err="failed to get container status \"6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc\": rpc error: code = NotFound desc = could not find container \"6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc\": container with ID starting with 6578a32db663d483b736a190c6245664ad891f0ee9fafb47ddd4d1b3fbba7bfc not found: ID does not exist"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.209542 4892 scope.go:117] "RemoveContainer" containerID="8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2"
Oct 06 12:33:29 crc kubenswrapper[4892]: E1006 12:33:29.209957 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2\": container with ID starting with 8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2 not found: ID does not exist" containerID="8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2"
Oct 06 12:33:29 crc kubenswrapper[4892]: I1006 12:33:29.209999 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2"} err="failed to get container status \"8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2\": rpc error: code = NotFound desc = could not find container \"8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2\": container with ID starting with 8f539a08cccd4cd5298c0699d1f3a6441c07f6e4ec94e3e21acf522549df4ec2 not found: ID does not exist"
Oct 06 12:33:30 crc kubenswrapper[4892]: I1006 12:33:30.200278 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16c9b412-439b-42df-975c-1bd741466d01" path="/var/lib/kubelet/pods/16c9b412-439b-42df-975c-1bd741466d01/volumes"
Oct 06 12:33:59 crc kubenswrapper[4892]: I1006 12:33:59.865961 4892 scope.go:117] "RemoveContainer" containerID="4c4118ce3a92264a7a65267fe65ae69f17c7790f9887cfafc5cb9301606868b4"
Oct 06 12:34:37 crc kubenswrapper[4892]: I1006 12:34:37.895366 4892 generic.go:334] "Generic (PLEG): container finished" podID="de624448-d17e-48b7-a11b-bcbd70fa860f" containerID="2ce270197dc64fda966b8d07709cb05bb8c37593f55236298f581cd679f95c57" exitCode=0
Oct 06 12:34:37 crc kubenswrapper[4892]: I1006 12:34:37.895480 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" event={"ID":"de624448-d17e-48b7-a11b-bcbd70fa860f","Type":"ContainerDied","Data":"2ce270197dc64fda966b8d07709cb05bb8c37593f55236298f581cd679f95c57"}
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.436492 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv"
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.599728 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-ssh-key\") pod \"de624448-d17e-48b7-a11b-bcbd70fa860f\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") "
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.599956 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-inventory\") pod \"de624448-d17e-48b7-a11b-bcbd70fa860f\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") "
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.600087 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2dgv\" (UniqueName: \"kubernetes.io/projected/de624448-d17e-48b7-a11b-bcbd70fa860f-kube-api-access-h2dgv\") pod \"de624448-d17e-48b7-a11b-bcbd70fa860f\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") "
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.600196 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-bootstrap-combined-ca-bundle\") pod \"de624448-d17e-48b7-a11b-bcbd70fa860f\" (UID: \"de624448-d17e-48b7-a11b-bcbd70fa860f\") "
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.607023 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de624448-d17e-48b7-a11b-bcbd70fa860f-kube-api-access-h2dgv" (OuterVolumeSpecName: "kube-api-access-h2dgv") pod "de624448-d17e-48b7-a11b-bcbd70fa860f" (UID: "de624448-d17e-48b7-a11b-bcbd70fa860f"). InnerVolumeSpecName "kube-api-access-h2dgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.608009 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "de624448-d17e-48b7-a11b-bcbd70fa860f" (UID: "de624448-d17e-48b7-a11b-bcbd70fa860f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.650586 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-inventory" (OuterVolumeSpecName: "inventory") pod "de624448-d17e-48b7-a11b-bcbd70fa860f" (UID: "de624448-d17e-48b7-a11b-bcbd70fa860f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.658054 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "de624448-d17e-48b7-a11b-bcbd70fa860f" (UID: "de624448-d17e-48b7-a11b-bcbd70fa860f"). InnerVolumeSpecName "ssh-key".
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.703817 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2dgv\" (UniqueName: \"kubernetes.io/projected/de624448-d17e-48b7-a11b-bcbd70fa860f-kube-api-access-h2dgv\") on node \"crc\" DevicePath \"\"" Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.703867 4892 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.703887 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.703911 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de624448-d17e-48b7-a11b-bcbd70fa860f-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.921126 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" event={"ID":"de624448-d17e-48b7-a11b-bcbd70fa860f","Type":"ContainerDied","Data":"1347dda7a239c02a87e82421fc977bb16b57205e2b8fad3e05494ed53788e7ff"} Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.921186 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1347dda7a239c02a87e82421fc977bb16b57205e2b8fad3e05494ed53788e7ff" Oct 06 12:34:39 crc kubenswrapper[4892]: I1006 12:34:39.921671 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.062106 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl"] Oct 06 12:34:40 crc kubenswrapper[4892]: E1006 12:34:40.062822 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="registry-server" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.062910 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="registry-server" Oct 06 12:34:40 crc kubenswrapper[4892]: E1006 12:34:40.063057 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="extract-utilities" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.063165 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="extract-utilities" Oct 06 12:34:40 crc kubenswrapper[4892]: E1006 12:34:40.063273 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="extract-content" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.063377 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="extract-content" Oct 06 12:34:40 crc kubenswrapper[4892]: E1006 12:34:40.063468 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de624448-d17e-48b7-a11b-bcbd70fa860f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.063539 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="de624448-d17e-48b7-a11b-bcbd70fa860f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.063845 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="de624448-d17e-48b7-a11b-bcbd70fa860f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.063938 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="16c9b412-439b-42df-975c-1bd741466d01" containerName="registry-server" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.065413 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.068873 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.069525 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.069737 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.069920 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.078486 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl"] Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.214849 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.214969 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mckdd\" (UniqueName: \"kubernetes.io/projected/14a48578-0a22-4ebc-b227-a5fa1ffca71a-kube-api-access-mckdd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.215045 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.317112 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mckdd\" (UniqueName: \"kubernetes.io/projected/14a48578-0a22-4ebc-b227-a5fa1ffca71a-kube-api-access-mckdd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.317227 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.317367 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.322480 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.323227 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.349703 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mckdd\" (UniqueName: \"kubernetes.io/projected/14a48578-0a22-4ebc-b227-a5fa1ffca71a-kube-api-access-mckdd\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:40 crc kubenswrapper[4892]: I1006 12:34:40.390448 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:34:41 crc kubenswrapper[4892]: I1006 12:34:41.096836 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:34:41 crc kubenswrapper[4892]: I1006 12:34:41.099412 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl"] Oct 06 12:34:41 crc kubenswrapper[4892]: I1006 12:34:41.946302 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" event={"ID":"14a48578-0a22-4ebc-b227-a5fa1ffca71a","Type":"ContainerStarted","Data":"e3e517365a0eb5a3ffe5842599971261de1cdcccb38777b1f0b6f8309e84577e"} Oct 06 12:34:42 crc kubenswrapper[4892]: I1006 12:34:42.963585 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" event={"ID":"14a48578-0a22-4ebc-b227-a5fa1ffca71a","Type":"ContainerStarted","Data":"0369195be1aefec216c0daae7a361f628f230f710084df8d09ab1f034e456720"} Oct 06 12:34:43 crc kubenswrapper[4892]: I1006 12:34:43.000368 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" podStartSLOduration=2.307370856 podStartE2EDuration="3.000342935s" podCreationTimestamp="2025-10-06 12:34:40 +0000 UTC" firstStartedPulling="2025-10-06 12:34:41.09665742 +0000 UTC m=+1567.646363185" lastFinishedPulling="2025-10-06 12:34:41.789629459 +0000 UTC m=+1568.339335264" observedRunningTime="2025-10-06 12:34:42.98837266 +0000 UTC m=+1569.538078465" watchObservedRunningTime="2025-10-06 12:34:43.000342935 +0000 UTC m=+1569.550048730" Oct 06 12:34:59 crc kubenswrapper[4892]: I1006 12:34:59.978697 4892 scope.go:117] "RemoveContainer" 
containerID="cfad99feca6c74434b5ca451369cb5b5190835929a02390324a9135835bfbb95" Oct 06 12:35:00 crc kubenswrapper[4892]: I1006 12:35:00.008273 4892 scope.go:117] "RemoveContainer" containerID="3a0ff92d2b15287d4f53d31677923680c3531cefe8e9360909ce2db12478679f" Oct 06 12:35:16 crc kubenswrapper[4892]: I1006 12:35:16.065840 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-t6bmr"] Oct 06 12:35:16 crc kubenswrapper[4892]: I1006 12:35:16.082477 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-t6bmr"] Oct 06 12:35:16 crc kubenswrapper[4892]: I1006 12:35:16.183622 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a660a565-71b7-4fd3-8864-f633a0dc1240" path="/var/lib/kubelet/pods/a660a565-71b7-4fd3-8864-f633a0dc1240/volumes" Oct 06 12:35:17 crc kubenswrapper[4892]: I1006 12:35:17.045811 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-mzrvr"] Oct 06 12:35:17 crc kubenswrapper[4892]: I1006 12:35:17.077939 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vgr8m"] Oct 06 12:35:17 crc kubenswrapper[4892]: I1006 12:35:17.089864 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pnfbr"] Oct 06 12:35:17 crc kubenswrapper[4892]: I1006 12:35:17.099867 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vgr8m"] Oct 06 12:35:17 crc kubenswrapper[4892]: I1006 12:35:17.108414 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-mzrvr"] Oct 06 12:35:17 crc kubenswrapper[4892]: I1006 12:35:17.115762 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pnfbr"] Oct 06 12:35:18 crc kubenswrapper[4892]: I1006 12:35:18.193809 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30982372-73ba-48f1-b3b3-541d8c51d6ce" path="/var/lib/kubelet/pods/30982372-73ba-48f1-b3b3-541d8c51d6ce/volumes" Oct 06 12:35:18 crc kubenswrapper[4892]: I1006 12:35:18.196131 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5823de-6b73-4608-b37e-031dc44dc68b" path="/var/lib/kubelet/pods/7a5823de-6b73-4608-b37e-031dc44dc68b/volumes" Oct 06 12:35:18 crc kubenswrapper[4892]: I1006 12:35:18.198101 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9aee1c6-1d4d-4fd4-9aee-2760312e0e63" path="/var/lib/kubelet/pods/b9aee1c6-1d4d-4fd4-9aee-2760312e0e63/volumes" Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.064877 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-90fa-account-create-vgbzz"] Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.073751 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-523d-account-create-4mr87"] Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.082273 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-04c9-account-create-mqr4p"] Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.090571 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-90fa-account-create-vgbzz"] Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.097248 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-523d-account-create-4mr87"] Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.104232 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-04c9-account-create-mqr4p"] 
Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.191274 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d0c75fb-89d3-494d-a468-01293842310b" path="/var/lib/kubelet/pods/3d0c75fb-89d3-494d-a468-01293842310b/volumes" Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.192475 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7f1fb6-763b-45fa-87dd-027c5397ed92" path="/var/lib/kubelet/pods/5f7f1fb6-763b-45fa-87dd-027c5397ed92/volumes" Oct 06 12:35:32 crc kubenswrapper[4892]: I1006 12:35:32.194098 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a658d1b-748b-4345-881e-54b1369b86d0" path="/var/lib/kubelet/pods/9a658d1b-748b-4345-881e-54b1369b86d0/volumes" Oct 06 12:35:34 crc kubenswrapper[4892]: I1006 12:35:34.052357 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-1d51-account-create-cmxch"] Oct 06 12:35:34 crc kubenswrapper[4892]: I1006 12:35:34.067281 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-1d51-account-create-cmxch"] Oct 06 12:35:34 crc kubenswrapper[4892]: I1006 12:35:34.204407 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d216276-2620-49c2-8be9-05784aca5d45" path="/var/lib/kubelet/pods/2d216276-2620-49c2-8be9-05784aca5d45/volumes" Oct 06 12:35:52 crc kubenswrapper[4892]: I1006 12:35:52.983994 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:35:52 crc kubenswrapper[4892]: I1006 12:35:52.984743 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:35:54 crc kubenswrapper[4892]: I1006 12:35:54.078874 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zqblg"] Oct 06 12:35:54 crc kubenswrapper[4892]: I1006 12:35:54.098764 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cccks"] Oct 06 12:35:54 crc kubenswrapper[4892]: I1006 12:35:54.111437 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zqblg"] Oct 06 12:35:54 crc kubenswrapper[4892]: I1006 12:35:54.121577 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cccks"] Oct 06 12:35:54 crc kubenswrapper[4892]: I1006 12:35:54.187412 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c01bdac-6171-4739-8a4d-93e871a3cbe4" path="/var/lib/kubelet/pods/2c01bdac-6171-4739-8a4d-93e871a3cbe4/volumes" Oct 06 12:35:54 crc kubenswrapper[4892]: I1006 12:35:54.188148 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d54d8a4-1c52-4e19-a689-e68c7f751bbd" path="/var/lib/kubelet/pods/2d54d8a4-1c52-4e19-a689-e68c7f751bbd/volumes" Oct 06 12:35:55 crc kubenswrapper[4892]: I1006 12:35:55.048045 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-ph7rn"] Oct 06 12:35:55 crc kubenswrapper[4892]: I1006 12:35:55.059725 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-ph7rn"] Oct 06 12:35:56 
crc kubenswrapper[4892]: I1006 12:35:56.188512 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2bdc381-3838-44ce-bcfc-9cc714973c19" path="/var/lib/kubelet/pods/e2bdc381-3838-44ce-bcfc-9cc714973c19/volumes" Oct 06 12:35:59 crc kubenswrapper[4892]: I1006 12:35:59.041337 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-nvxxf"] Oct 06 12:35:59 crc kubenswrapper[4892]: I1006 12:35:59.059783 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-nvxxf"] Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.091169 4892 scope.go:117] "RemoveContainer" containerID="fcc43a7ac582e3c6a72e7f49c95e35bc0a187b6b8538f5287799dc1c637b80eb" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.126404 4892 scope.go:117] "RemoveContainer" containerID="de7a2f342856eed731d9c75acc4d717f3d0fb8dfc6be121cf100757515c9ba79" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.183766 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c27657-8048-4b06-9079-4e89e71f369e" path="/var/lib/kubelet/pods/21c27657-8048-4b06-9079-4e89e71f369e/volumes" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.184659 4892 scope.go:117] "RemoveContainer" containerID="50177a9bcdc642ce507031c7b9da5ffd8bd1086536674d1c74931054aac365c4" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.240934 4892 scope.go:117] "RemoveContainer" containerID="2a469f1b7a3690e6f78faaeae562c456e62c27df6fbaefb2814211691e64f81a" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.309164 4892 scope.go:117] "RemoveContainer" containerID="c7e1f580ebc331249c116ea91a1fd22f44f3b23aeb06e42884fc192888f4315f" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.379481 4892 scope.go:117] "RemoveContainer" containerID="032bfdb681258912b6d32e8e1384c7156e9becfc6f82fd17ea605531be34a1ef" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.413633 4892 scope.go:117] "RemoveContainer" containerID="57a581dbd135bfbd8eca9715c3f4b99506811f2ae5aa9eb68b4773e70f691ea4" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.438959 4892 scope.go:117] "RemoveContainer" containerID="c8e643b16ef17d42c6ea36a2e772d9f84d7dd21f81526165a2090abb5adf5acf" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.465670 4892 scope.go:117] "RemoveContainer" containerID="1d109eef19210b4534f30b697185078e0fef41e0be75d1cb89d96b91fcad67c8" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.492642 4892 scope.go:117] "RemoveContainer" containerID="d775c5f4f1f8c92b811f7c6014bf303282cda0fcea766855a5889d929c0cddd1" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.517606 4892 scope.go:117] "RemoveContainer" containerID="8fb2675980b207eb06aaf32bef2c8758e2bb924e25fbbe8c6309f61234d58b06" Oct 06 12:36:00 crc kubenswrapper[4892]: I1006 12:36:00.547718 4892 scope.go:117] "RemoveContainer" containerID="85839fc4bc2b4586e150263513f7c8b8240dfc8efd2339a44c3cf4be37e39d0c" Oct 06 12:36:06 crc kubenswrapper[4892]: I1006 12:36:06.030109 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-zjjqh"] Oct 06 12:36:06 crc kubenswrapper[4892]: I1006 12:36:06.038007 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-zjjqh"] Oct 06 12:36:06 crc kubenswrapper[4892]: I1006 12:36:06.181067 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="425bb472-054f-4ce7-8788-f63e794dff02" path="/var/lib/kubelet/pods/425bb472-054f-4ce7-8788-f63e794dff02/volumes" Oct 06 12:36:07 crc kubenswrapper[4892]: I1006 
12:36:07.041834 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-pk4nn"] Oct 06 12:36:07 crc kubenswrapper[4892]: I1006 12:36:07.052992 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-pk4nn"] Oct 06 12:36:08 crc kubenswrapper[4892]: I1006 12:36:08.185597 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355f8351-e83b-4a32-83a2-a1c2f3dc9ca3" path="/var/lib/kubelet/pods/355f8351-e83b-4a32-83a2-a1c2f3dc9ca3/volumes" Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.033082 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8e32-account-create-fg2wj"] Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.047232 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66cf-account-create-mbggr"] Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.057464 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-48d7-account-create-7wqqj"] Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.065126 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66cf-account-create-mbggr"] Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.072482 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-48d7-account-create-7wqqj"] Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.079418 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8e32-account-create-fg2wj"] Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.181432 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9933b0-25cc-4543-8266-1ad5e1fd72ff" path="/var/lib/kubelet/pods/ab9933b0-25cc-4543-8266-1ad5e1fd72ff/volumes" Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.181943 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b639fd-a918-4284-973b-3dd64770ca40" path="/var/lib/kubelet/pods/d6b639fd-a918-4284-973b-3dd64770ca40/volumes" Oct 06 12:36:10 crc kubenswrapper[4892]: I1006 12:36:10.182416 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70aa8f3-809f-4f1d-b8dd-8ecee4996fec" path="/var/lib/kubelet/pods/e70aa8f3-809f-4f1d-b8dd-8ecee4996fec/volumes" Oct 06 12:36:15 crc kubenswrapper[4892]: I1006 12:36:15.092136 4892 generic.go:334] "Generic (PLEG): container finished" podID="14a48578-0a22-4ebc-b227-a5fa1ffca71a" containerID="0369195be1aefec216c0daae7a361f628f230f710084df8d09ab1f034e456720" exitCode=0 Oct 06 12:36:15 crc kubenswrapper[4892]: I1006 12:36:15.092262 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" event={"ID":"14a48578-0a22-4ebc-b227-a5fa1ffca71a","Type":"ContainerDied","Data":"0369195be1aefec216c0daae7a361f628f230f710084df8d09ab1f034e456720"} Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.575500 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.777169 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-inventory\") pod \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.777225 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mckdd\" (UniqueName: \"kubernetes.io/projected/14a48578-0a22-4ebc-b227-a5fa1ffca71a-kube-api-access-mckdd\") pod \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.777314 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-ssh-key\") pod \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\" (UID: \"14a48578-0a22-4ebc-b227-a5fa1ffca71a\") " Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.783263 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a48578-0a22-4ebc-b227-a5fa1ffca71a-kube-api-access-mckdd" (OuterVolumeSpecName: "kube-api-access-mckdd") pod "14a48578-0a22-4ebc-b227-a5fa1ffca71a" (UID: "14a48578-0a22-4ebc-b227-a5fa1ffca71a"). InnerVolumeSpecName "kube-api-access-mckdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.814436 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-inventory" (OuterVolumeSpecName: "inventory") pod "14a48578-0a22-4ebc-b227-a5fa1ffca71a" (UID: "14a48578-0a22-4ebc-b227-a5fa1ffca71a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.824639 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14a48578-0a22-4ebc-b227-a5fa1ffca71a" (UID: "14a48578-0a22-4ebc-b227-a5fa1ffca71a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.880781 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.880832 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mckdd\" (UniqueName: \"kubernetes.io/projected/14a48578-0a22-4ebc-b227-a5fa1ffca71a-kube-api-access-mckdd\") on node \"crc\" DevicePath \"\"" Oct 06 12:36:16 crc kubenswrapper[4892]: I1006 12:36:16.880855 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14a48578-0a22-4ebc-b227-a5fa1ffca71a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.121642 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" event={"ID":"14a48578-0a22-4ebc-b227-a5fa1ffca71a","Type":"ContainerDied","Data":"e3e517365a0eb5a3ffe5842599971261de1cdcccb38777b1f0b6f8309e84577e"} Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.121758 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3e517365a0eb5a3ffe5842599971261de1cdcccb38777b1f0b6f8309e84577e" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.122115 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.217515 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5"] Oct 06 12:36:17 crc kubenswrapper[4892]: E1006 12:36:17.218062 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a48578-0a22-4ebc-b227-a5fa1ffca71a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.218089 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a48578-0a22-4ebc-b227-a5fa1ffca71a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.218473 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a48578-0a22-4ebc-b227-a5fa1ffca71a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.219350 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.221340 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.221383 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.221412 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.222398 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.241358 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5"] Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.392029 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swb6n\" (UniqueName: \"kubernetes.io/projected/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-kube-api-access-swb6n\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.393187 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.393502 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.495423 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swb6n\" (UniqueName: \"kubernetes.io/projected/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-kube-api-access-swb6n\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.495667 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.495885 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-ssh-key\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.503060 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.503089 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.538320 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swb6n\" (UniqueName: \"kubernetes.io/projected/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-kube-api-access-swb6n\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:17 crc kubenswrapper[4892]: I1006 12:36:17.543318 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:36:18 crc kubenswrapper[4892]: I1006 12:36:18.124981 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5"] Oct 06 12:36:18 crc kubenswrapper[4892]: I1006 12:36:18.145823 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" event={"ID":"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed","Type":"ContainerStarted","Data":"3804584a970b5a2fcf8ad25025308314986b250db35aed827a592baa73ef472e"} Oct 06 12:36:19 crc kubenswrapper[4892]: I1006 12:36:19.159067 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" event={"ID":"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed","Type":"ContainerStarted","Data":"6615ca4d6d4edfe76dd57b163b5a568007f0000cafbd6f2730fb8d4a5cc7ef47"} Oct 06 12:36:22 crc kubenswrapper[4892]: I1006 12:36:22.984205 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:36:22 crc kubenswrapper[4892]: I1006 12:36:22.984636 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:36:33 crc kubenswrapper[4892]: I1006 12:36:33.052210 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" podStartSLOduration=15.595220285 podStartE2EDuration="16.052186662s" podCreationTimestamp="2025-10-06 12:36:17 +0000 UTC" firstStartedPulling="2025-10-06 12:36:18.130233035 +0000 UTC m=+1664.679938800" lastFinishedPulling="2025-10-06 12:36:18.587199372 +0000 UTC m=+1665.136905177" observedRunningTime="2025-10-06 12:36:19.182828927 +0000 UTC m=+1665.732534702" watchObservedRunningTime="2025-10-06 12:36:33.052186662 +0000 UTC m=+1679.601892437" Oct 06 12:36:33 crc kubenswrapper[4892]: I1006 12:36:33.060362 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jps9b"] Oct 06 12:36:33 crc kubenswrapper[4892]: I1006 12:36:33.080470 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jps9b"] Oct 06 12:36:34 crc kubenswrapper[4892]: I1006 12:36:34.040799 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rqpxp"] Oct 06 12:36:34 crc kubenswrapper[4892]: I1006 12:36:34.053134 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rqpxp"] Oct 06 12:36:34 crc kubenswrapper[4892]: I1006 12:36:34.200990 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2673cbbe-dc84-4a24-a48a-303029fcc02a" path="/var/lib/kubelet/pods/2673cbbe-dc84-4a24-a48a-303029fcc02a/volumes" Oct 06 12:36:34 crc kubenswrapper[4892]: I1006 12:36:34.202189 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657b347a-9a82-404a-b263-f51befcd5837" path="/var/lib/kubelet/pods/657b347a-9a82-404a-b263-f51befcd5837/volumes" Oct 06 12:36:41 crc kubenswrapper[4892]: I1006 12:36:41.033478 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8t56w"] Oct 06 12:36:41 crc kubenswrapper[4892]: I1006 12:36:41.041965 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8t56w"] Oct 06 12:36:42 crc kubenswrapper[4892]: I1006 12:36:42.184371 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de602d1-1bde-4049-88a7-d8132dee5d53" path="/var/lib/kubelet/pods/5de602d1-1bde-4049-88a7-d8132dee5d53/volumes" Oct 06 12:36:52 crc kubenswrapper[4892]: I1006 12:36:52.984665 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:36:52 crc kubenswrapper[4892]: I1006 12:36:52.985346 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:36:52 crc kubenswrapper[4892]: I1006 12:36:52.985397 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:36:52 crc kubenswrapper[4892]: I1006 12:36:52.986207 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:36:52 crc kubenswrapper[4892]: I1006 12:36:52.986273 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" gracePeriod=600 Oct 06 12:36:53 crc kubenswrapper[4892]: I1006 12:36:53.049137 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-knkh9"] Oct 06 12:36:53 crc kubenswrapper[4892]: I1006 12:36:53.058989 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-knkh9"] Oct 06 12:36:53 crc kubenswrapper[4892]: E1006 12:36:53.114424 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:36:53 crc kubenswrapper[4892]: I1006 12:36:53.543449 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" exitCode=0 Oct 06 12:36:53 crc kubenswrapper[4892]: I1006 12:36:53.543509 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262"} Oct 06 12:36:53 crc kubenswrapper[4892]: I1006 12:36:53.543554 4892 scope.go:117] "RemoveContainer" containerID="8e6bec4311317cf3d786aab7279e92bdb6ecd5789603229094ff6446f6367943" Oct 06 12:36:53 crc kubenswrapper[4892]: I1006 12:36:53.544521 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:36:53 crc kubenswrapper[4892]: E1006 12:36:53.545125 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:36:54 crc kubenswrapper[4892]: I1006 12:36:54.188782 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de86e5ee-d52e-4d8b-8077-a0d86175878c" path="/var/lib/kubelet/pods/de86e5ee-d52e-4d8b-8077-a0d86175878c/volumes" Oct 06 12:37:00 crc kubenswrapper[4892]: I1006 12:37:00.818732 4892 scope.go:117] "RemoveContainer" containerID="fe14ac778c8b08fc05e6b34f35457cb0af828a54d5a9e0885b0a36ccedeb2a37" Oct 06 12:37:00 crc kubenswrapper[4892]: I1006 12:37:00.847754 4892 scope.go:117] "RemoveContainer" containerID="b2efbc72e8e26c11fc05ed783b9dd9ba1b5a6b45bd15433bbe4472edf69efd43" Oct 06 12:37:00 crc kubenswrapper[4892]: I1006 12:37:00.914488 4892 scope.go:117] "RemoveContainer" containerID="0e79e6fdd772badf78468b0c2aeff4552c4709b13d72819e5b2e3cdd3c33b786" Oct 06 12:37:00 crc kubenswrapper[4892]: I1006 
12:37:00.972955 4892 scope.go:117] "RemoveContainer" containerID="f2d055771567f8d238078fe05a09537ba536dbaf1185c3106edc7215a2538356" Oct 06 12:37:01 crc kubenswrapper[4892]: I1006 12:37:01.006124 4892 scope.go:117] "RemoveContainer" containerID="b83501735257d54d9d2d40c03459e09b3dbfd6e3c193cc6ad510c08d53ca7fd5" Oct 06 12:37:01 crc kubenswrapper[4892]: I1006 12:37:01.080228 4892 scope.go:117] "RemoveContainer" containerID="c8649c4a844019867546376f544e67b95ac672156102e7d197d5fe6c9dfd3682" Oct 06 12:37:01 crc kubenswrapper[4892]: I1006 12:37:01.114768 4892 scope.go:117] "RemoveContainer" containerID="3c2a02d34a6fd831789c5c2c29b8643e0b8a93f9f084d92901abee5e94d3a7d3" Oct 06 12:37:01 crc kubenswrapper[4892]: I1006 12:37:01.145556 4892 scope.go:117] "RemoveContainer" containerID="56cc6a8ccdb7232475990d2549c4699fc09caec18349135f904c1dde40002310" Oct 06 12:37:01 crc kubenswrapper[4892]: I1006 12:37:01.170988 4892 scope.go:117] "RemoveContainer" containerID="e7522be5cdf0eb62acdf83af5048209403247bb5accda3c82d2aeeeea284ed4f" Oct 06 12:37:06 crc kubenswrapper[4892]: I1006 12:37:06.170267 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:37:06 crc kubenswrapper[4892]: E1006 12:37:06.171827 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:37:14 crc kubenswrapper[4892]: I1006 12:37:14.038389 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rk9bz"] Oct 06 12:37:14 crc kubenswrapper[4892]: I1006 12:37:14.051939 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rk9bz"] Oct 06 12:37:14 crc kubenswrapper[4892]: I1006 12:37:14.188586 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea" path="/var/lib/kubelet/pods/eb654fe9-6aa6-4b10-88b8-9da5b1dac9ea/volumes" Oct 06 12:37:19 crc kubenswrapper[4892]: I1006 12:37:19.169671 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:37:19 crc kubenswrapper[4892]: E1006 12:37:19.170891 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:37:29 crc kubenswrapper[4892]: I1006 12:37:29.049378 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-g72j5"] Oct 06 12:37:29 crc kubenswrapper[4892]: I1006 12:37:29.059189 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hbr5f"] Oct 06 12:37:29 crc kubenswrapper[4892]: I1006 12:37:29.068365 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-q7svw"] Oct 06 12:37:29 crc kubenswrapper[4892]: I1006 12:37:29.076823 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-db-create-hbr5f"] Oct 06 12:37:29 crc kubenswrapper[4892]: I1006 12:37:29.083616 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-g72j5"] Oct 06 12:37:29 crc kubenswrapper[4892]: I1006 12:37:29.090971 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-q7svw"] Oct 06 12:37:30 crc kubenswrapper[4892]: I1006 12:37:30.189821 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367eb665-c929-4ea9-8fb5-cd23cd430278" path="/var/lib/kubelet/pods/367eb665-c929-4ea9-8fb5-cd23cd430278/volumes" Oct 06 12:37:30 crc kubenswrapper[4892]: I1006 12:37:30.191162 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63c64d6-eb6d-4370-abd8-b4eb8487ae8a" path="/var/lib/kubelet/pods/b63c64d6-eb6d-4370-abd8-b4eb8487ae8a/volumes" Oct 06 12:37:30 crc kubenswrapper[4892]: I1006 12:37:30.192201 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be99d63c-4bdd-4dae-a003-5215816244ac" path="/var/lib/kubelet/pods/be99d63c-4bdd-4dae-a003-5215816244ac/volumes" Oct 06 12:37:32 crc kubenswrapper[4892]: I1006 12:37:32.170869 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:37:32 crc kubenswrapper[4892]: E1006 12:37:32.171454 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:37:33 crc kubenswrapper[4892]: I1006 12:37:33.001637 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" event={"ID":"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed","Type":"ContainerDied","Data":"6615ca4d6d4edfe76dd57b163b5a568007f0000cafbd6f2730fb8d4a5cc7ef47"} Oct 06 12:37:33 crc kubenswrapper[4892]: I1006 12:37:33.001719 4892 generic.go:334] "Generic (PLEG): container finished" podID="f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed" containerID="6615ca4d6d4edfe76dd57b163b5a568007f0000cafbd6f2730fb8d4a5cc7ef47" exitCode=0 Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.030137 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c7bd-account-create-xgwv5"] Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.039950 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c7bd-account-create-xgwv5"] Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.183290 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4541a67b-e69c-4650-9de1-db5abe24d73b" path="/var/lib/kubelet/pods/4541a67b-e69c-4650-9de1-db5abe24d73b/volumes" Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.433044 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.455810 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-inventory\") pod \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.455903 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-ssh-key\") pod \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.456034 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swb6n\" (UniqueName: \"kubernetes.io/projected/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-kube-api-access-swb6n\") pod \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\" (UID: \"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed\") " Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.464500 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-kube-api-access-swb6n" (OuterVolumeSpecName: "kube-api-access-swb6n") pod "f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed" (UID: "f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed"). InnerVolumeSpecName "kube-api-access-swb6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.486502 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-inventory" (OuterVolumeSpecName: "inventory") pod "f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed" (UID: "f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.521475 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed" (UID: "f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.558431 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.558465 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:37:34 crc kubenswrapper[4892]: I1006 12:37:34.558478 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swb6n\" (UniqueName: \"kubernetes.io/projected/f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed-kube-api-access-swb6n\") on node \"crc\" DevicePath \"\"" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.032736 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" event={"ID":"f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed","Type":"ContainerDied","Data":"3804584a970b5a2fcf8ad25025308314986b250db35aed827a592baa73ef472e"} Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.032799 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.032799 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3804584a970b5a2fcf8ad25025308314986b250db35aed827a592baa73ef472e" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.129975 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n"] Oct 06 12:37:35 crc kubenswrapper[4892]: E1006 12:37:35.130465 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.130492 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.130750 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.131727 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.135426 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.135741 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.135984 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.136567 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.143624 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n"] Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.173951 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4656\" (UniqueName: \"kubernetes.io/projected/392cffb3-245d-4f4a-86eb-81e59a488996-kube-api-access-l4656\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.174016 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.174141 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.275844 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.275962 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4656\" (UniqueName: \"kubernetes.io/projected/392cffb3-245d-4f4a-86eb-81e59a488996-kube-api-access-l4656\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.275999 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.279996 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.287294 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.298143 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4656\" (UniqueName: \"kubernetes.io/projected/392cffb3-245d-4f4a-86eb-81e59a488996-kube-api-access-l4656\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.459560 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" Oct 06 12:37:35 crc kubenswrapper[4892]: I1006 12:37:35.998205 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n"] Oct 06 12:37:36 crc kubenswrapper[4892]: I1006 12:37:36.050701 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" event={"ID":"392cffb3-245d-4f4a-86eb-81e59a488996","Type":"ContainerStarted","Data":"2ead24d562d97b5b233aab9d5b16abc2cc5ee10819badb11c92d90293f7f6c67"} Oct 06 12:37:37 crc kubenswrapper[4892]: I1006 12:37:37.066586 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" event={"ID":"392cffb3-245d-4f4a-86eb-81e59a488996","Type":"ContainerStarted","Data":"9580459353be58087ed657802530b9764af380fbcc386a80b90ff1fb4e977f47"} Oct 06 12:37:37 crc kubenswrapper[4892]: I1006 12:37:37.103482 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" podStartSLOduration=1.621809293 podStartE2EDuration="2.103455544s" podCreationTimestamp="2025-10-06 12:37:35 +0000 UTC" firstStartedPulling="2025-10-06 12:37:36.007198437 +0000 UTC m=+1742.556904212" lastFinishedPulling="2025-10-06 12:37:36.488844668 +0000 UTC m=+1743.038550463" observedRunningTime="2025-10-06 12:37:37.095686189 +0000 UTC m=+1743.645391994" watchObservedRunningTime="2025-10-06 12:37:37.103455544 +0000 UTC m=+1743.653161319" Oct 06 12:37:42 crc kubenswrapper[4892]: I1006 12:37:42.126241 4892 generic.go:334] "Generic (PLEG): container finished" podID="392cffb3-245d-4f4a-86eb-81e59a488996" containerID="9580459353be58087ed657802530b9764af380fbcc386a80b90ff1fb4e977f47" exitCode=0 Oct 06 12:37:42 crc kubenswrapper[4892]: I1006 
Oct 06 12:37:42 crc kubenswrapper[4892]: I1006 12:37:42.126395 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" event={"ID":"392cffb3-245d-4f4a-86eb-81e59a488996","Type":"ContainerDied","Data":"9580459353be58087ed657802530b9764af380fbcc386a80b90ff1fb4e977f47"}
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.642302 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n"
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.755860 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-ssh-key\") pod \"392cffb3-245d-4f4a-86eb-81e59a488996\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") "
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.755906 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4656\" (UniqueName: \"kubernetes.io/projected/392cffb3-245d-4f4a-86eb-81e59a488996-kube-api-access-l4656\") pod \"392cffb3-245d-4f4a-86eb-81e59a488996\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") "
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.755954 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-inventory\") pod \"392cffb3-245d-4f4a-86eb-81e59a488996\" (UID: \"392cffb3-245d-4f4a-86eb-81e59a488996\") "
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.763463 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/392cffb3-245d-4f4a-86eb-81e59a488996-kube-api-access-l4656" (OuterVolumeSpecName: "kube-api-access-l4656") pod "392cffb3-245d-4f4a-86eb-81e59a488996" (UID: "392cffb3-245d-4f4a-86eb-81e59a488996"). InnerVolumeSpecName "kube-api-access-l4656". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.789791 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "392cffb3-245d-4f4a-86eb-81e59a488996" (UID: "392cffb3-245d-4f4a-86eb-81e59a488996"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.814116 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-inventory" (OuterVolumeSpecName: "inventory") pod "392cffb3-245d-4f4a-86eb-81e59a488996" (UID: "392cffb3-245d-4f4a-86eb-81e59a488996"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.859207 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.859260 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/392cffb3-245d-4f4a-86eb-81e59a488996-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 12:37:43 crc kubenswrapper[4892]: I1006 12:37:43.859279 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4656\" (UniqueName: \"kubernetes.io/projected/392cffb3-245d-4f4a-86eb-81e59a488996-kube-api-access-l4656\") on node \"crc\" DevicePath \"\""
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.028834 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-2416-account-create-8kmnp"]
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.037520 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9231-account-create-pt6vz"]
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.046941 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-2416-account-create-8kmnp"]
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.054714 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9231-account-create-pt6vz"]
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.156241 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n" event={"ID":"392cffb3-245d-4f4a-86eb-81e59a488996","Type":"ContainerDied","Data":"2ead24d562d97b5b233aab9d5b16abc2cc5ee10819badb11c92d90293f7f6c67"}
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.156516 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ead24d562d97b5b233aab9d5b16abc2cc5ee10819badb11c92d90293f7f6c67"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.156467 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.186047 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262"
Oct 06 12:37:44 crc kubenswrapper[4892]: E1006 12:37:44.186343 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.189043 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34fa2c05-cfbf-4c6d-ad52-01568460df84" path="/var/lib/kubelet/pods/34fa2c05-cfbf-4c6d-ad52-01568460df84/volumes"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.190129 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1cfcdc0-6f22-4778-9a8f-c050eba27482" path="/var/lib/kubelet/pods/a1cfcdc0-6f22-4778-9a8f-c050eba27482/volumes"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.232276 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"]
Oct 06 12:37:44 crc kubenswrapper[4892]: E1006 12:37:44.232943 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="392cffb3-245d-4f4a-86eb-81e59a488996" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.233077 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="392cffb3-245d-4f4a-86eb-81e59a488996" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.235971 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="392cffb3-245d-4f4a-86eb-81e59a488996" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.237401 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.239858 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.240100 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.240341 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.244940 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.279658 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"]
Oct 06 12:37:44 crc kubenswrapper[4892]: E1006 12:37:44.336946 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392cffb3_245d_4f4a_86eb_81e59a488996.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod392cffb3_245d_4f4a_86eb_81e59a488996.slice/crio-2ead24d562d97b5b233aab9d5b16abc2cc5ee10819badb11c92d90293f7f6c67\": RecentStats: unable to find data in memory cache]"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.381467 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.381560 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frlvh\" (UniqueName: \"kubernetes.io/projected/c8b52544-e7f5-4cab-9b11-1bf028d07c61-kube-api-access-frlvh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.381996 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.483519 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.483635 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.483703 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frlvh\" (UniqueName: \"kubernetes.io/projected/c8b52544-e7f5-4cab-9b11-1bf028d07c61-kube-api-access-frlvh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.490027 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.492807 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.513919 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frlvh\" (UniqueName: \"kubernetes.io/projected/c8b52544-e7f5-4cab-9b11-1bf028d07c61-kube-api-access-frlvh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-jw9jh\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Oct 06 12:37:44 crc kubenswrapper[4892]: I1006 12:37:44.567191 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" Oct 06 12:37:45 crc kubenswrapper[4892]: I1006 12:37:45.167943 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh"] Oct 06 12:37:45 crc kubenswrapper[4892]: W1006 12:37:45.173010 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b52544_e7f5_4cab_9b11_1bf028d07c61.slice/crio-e4eb8bbeba9a13d290240bb4a1d7c53232a1265fddab7680b1f397b892c42724 WatchSource:0}: Error finding container e4eb8bbeba9a13d290240bb4a1d7c53232a1265fddab7680b1f397b892c42724: Status 404 returned error can't find the container with id e4eb8bbeba9a13d290240bb4a1d7c53232a1265fddab7680b1f397b892c42724 Oct 06 12:37:46 crc kubenswrapper[4892]: I1006 12:37:46.181728 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" event={"ID":"c8b52544-e7f5-4cab-9b11-1bf028d07c61","Type":"ContainerStarted","Data":"92c0ed260ddf45aee04d2fbab8f702fb00b8449f427620e40aa82179d0f89a61"} Oct 06 12:37:46 crc kubenswrapper[4892]: I1006 12:37:46.182302 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" event={"ID":"c8b52544-e7f5-4cab-9b11-1bf028d07c61","Type":"ContainerStarted","Data":"e4eb8bbeba9a13d290240bb4a1d7c53232a1265fddab7680b1f397b892c42724"} Oct 06 12:37:46 crc kubenswrapper[4892]: I1006 12:37:46.197035 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" podStartSLOduration=1.740572862 podStartE2EDuration="2.197005973s" podCreationTimestamp="2025-10-06 12:37:44 +0000 UTC" firstStartedPulling="2025-10-06 12:37:45.176093621 +0000 UTC m=+1751.725799386" lastFinishedPulling="2025-10-06 12:37:45.632526692 +0000 UTC m=+1752.182232497" observedRunningTime="2025-10-06 12:37:46.194028037 +0000 UTC m=+1752.743733812" watchObservedRunningTime="2025-10-06 12:37:46.197005973 +0000 UTC m=+1752.746711738" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.141100 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qgr8s"] Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.143674 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.207398 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qgr8s"] Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.284161 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfgn\" (UniqueName: \"kubernetes.io/projected/dd04082b-0289-4c3b-9f8d-4c5ac4707163-kube-api-access-9nfgn\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.284281 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-utilities\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.284453 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-catalog-content\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.386449 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-catalog-content\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.386687 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfgn\" (UniqueName: \"kubernetes.io/projected/dd04082b-0289-4c3b-9f8d-4c5ac4707163-kube-api-access-9nfgn\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.386818 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-utilities\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.387211 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-catalog-content\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.387532 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-utilities\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.414134 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9nfgn\" (UniqueName: \"kubernetes.io/projected/dd04082b-0289-4c3b-9f8d-4c5ac4707163-kube-api-access-9nfgn\") pod \"community-operators-qgr8s\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:57 crc kubenswrapper[4892]: I1006 12:37:57.499187 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:37:58 crc kubenswrapper[4892]: I1006 12:37:58.033593 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qgr8s"] Oct 06 12:37:58 crc kubenswrapper[4892]: W1006 12:37:58.047299 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd04082b_0289_4c3b_9f8d_4c5ac4707163.slice/crio-dd2327f9a6e9c070f43a62c835fa37583588af9b9ee6106912e342d3649f1dae WatchSource:0}: Error finding container dd2327f9a6e9c070f43a62c835fa37583588af9b9ee6106912e342d3649f1dae: Status 404 returned error can't find the container with id dd2327f9a6e9c070f43a62c835fa37583588af9b9ee6106912e342d3649f1dae Oct 06 12:37:58 crc kubenswrapper[4892]: I1006 12:37:58.310617 4892 generic.go:334] "Generic (PLEG): container finished" podID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerID="6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94" exitCode=0 Oct 06 12:37:58 crc kubenswrapper[4892]: I1006 12:37:58.310652 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgr8s" event={"ID":"dd04082b-0289-4c3b-9f8d-4c5ac4707163","Type":"ContainerDied","Data":"6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94"} Oct 06 12:37:58 crc kubenswrapper[4892]: I1006 12:37:58.310675 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgr8s" event={"ID":"dd04082b-0289-4c3b-9f8d-4c5ac4707163","Type":"ContainerStarted","Data":"dd2327f9a6e9c070f43a62c835fa37583588af9b9ee6106912e342d3649f1dae"} Oct 06 12:37:59 crc kubenswrapper[4892]: I1006 12:37:59.169575 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:37:59 crc kubenswrapper[4892]: E1006 12:37:59.170633 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:37:59 crc kubenswrapper[4892]: I1006 12:37:59.325088 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgr8s" event={"ID":"dd04082b-0289-4c3b-9f8d-4c5ac4707163","Type":"ContainerStarted","Data":"d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57"} Oct 06 12:38:00 crc kubenswrapper[4892]: I1006 12:38:00.342560 4892 generic.go:334] "Generic (PLEG): container finished" podID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerID="d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57" exitCode=0 Oct 06 12:38:00 crc kubenswrapper[4892]: I1006 12:38:00.342652 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgr8s" 
event={"ID":"dd04082b-0289-4c3b-9f8d-4c5ac4707163","Type":"ContainerDied","Data":"d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57"} Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.354617 4892 scope.go:117] "RemoveContainer" containerID="9e18ba5ab5c80f829005cdc295de4854d0af6eb32212f092dc3c3457b110d4a3" Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.357717 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgr8s" event={"ID":"dd04082b-0289-4c3b-9f8d-4c5ac4707163","Type":"ContainerStarted","Data":"07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd"} Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.382039 4892 scope.go:117] "RemoveContainer" containerID="f59939b48aa37657cafd0b0c0f232a4eae9dbca6620699a77ce5538aa0471113" Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.412960 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qgr8s" podStartSLOduration=1.622773853 podStartE2EDuration="4.412936739s" podCreationTimestamp="2025-10-06 12:37:57 +0000 UTC" firstStartedPulling="2025-10-06 12:37:58.312703125 +0000 UTC m=+1764.862408890" lastFinishedPulling="2025-10-06 12:38:01.102866011 +0000 UTC m=+1767.652571776" observedRunningTime="2025-10-06 12:38:01.385182984 +0000 UTC m=+1767.934888769" watchObservedRunningTime="2025-10-06 12:38:01.412936739 +0000 UTC m=+1767.962642504" Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.437387 4892 scope.go:117] "RemoveContainer" containerID="41e82361141e836db8e5cf4ba79ef7f46f7ed4e27fb5f38df914c6371d2a99c9" Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.471862 4892 scope.go:117] "RemoveContainer" containerID="5d57a9dcabefb22c4ddb614852f10e80b30da8b8e557bece3963c3c33a2f8296" Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.520893 4892 scope.go:117] "RemoveContainer" containerID="cf408c0fdd683bb3f656bf0cb6313060eac66780bb972b183b8321048737d8a4" Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.583056 4892 scope.go:117] "RemoveContainer" containerID="94ffbba9af6b2f85ea4ab925ef340ab9848563be2d1fd8d14327157208878515" Oct 06 12:38:01 crc kubenswrapper[4892]: I1006 12:38:01.608616 4892 scope.go:117] "RemoveContainer" containerID="dc2ee1d288172ce961b1b45c6874ef8af9788f61038164e0770605e25c3aac82" Oct 06 12:38:07 crc kubenswrapper[4892]: I1006 12:38:07.500194 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:38:07 crc kubenswrapper[4892]: I1006 12:38:07.500849 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:38:07 crc kubenswrapper[4892]: I1006 12:38:07.565359 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:38:08 crc kubenswrapper[4892]: I1006 12:38:08.484238 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:38:08 crc kubenswrapper[4892]: I1006 12:38:08.538089 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qgr8s"] Oct 06 12:38:10 crc kubenswrapper[4892]: I1006 12:38:10.169582 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:38:10 crc kubenswrapper[4892]: E1006 12:38:10.170722 4892 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:38:10 crc kubenswrapper[4892]: I1006 12:38:10.465836 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qgr8s" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="registry-server" containerID="cri-o://07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd" gracePeriod=2 Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.070166 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.189823 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nfgn\" (UniqueName: \"kubernetes.io/projected/dd04082b-0289-4c3b-9f8d-4c5ac4707163-kube-api-access-9nfgn\") pod \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.189933 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-catalog-content\") pod \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.189966 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-utilities\") pod \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\" (UID: \"dd04082b-0289-4c3b-9f8d-4c5ac4707163\") " Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.191203 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-utilities" (OuterVolumeSpecName: "utilities") pod "dd04082b-0289-4c3b-9f8d-4c5ac4707163" (UID: "dd04082b-0289-4c3b-9f8d-4c5ac4707163"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.196905 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd04082b-0289-4c3b-9f8d-4c5ac4707163-kube-api-access-9nfgn" (OuterVolumeSpecName: "kube-api-access-9nfgn") pod "dd04082b-0289-4c3b-9f8d-4c5ac4707163" (UID: "dd04082b-0289-4c3b-9f8d-4c5ac4707163"). InnerVolumeSpecName "kube-api-access-9nfgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.257795 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd04082b-0289-4c3b-9f8d-4c5ac4707163" (UID: "dd04082b-0289-4c3b-9f8d-4c5ac4707163"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.291885 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nfgn\" (UniqueName: \"kubernetes.io/projected/dd04082b-0289-4c3b-9f8d-4c5ac4707163-kube-api-access-9nfgn\") on node \"crc\" DevicePath \"\"" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.291915 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.291924 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd04082b-0289-4c3b-9f8d-4c5ac4707163-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.481555 4892 generic.go:334] "Generic (PLEG): container finished" podID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerID="07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd" exitCode=0 Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.481676 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qgr8s" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.481653 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgr8s" event={"ID":"dd04082b-0289-4c3b-9f8d-4c5ac4707163","Type":"ContainerDied","Data":"07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd"} Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.481782 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qgr8s" event={"ID":"dd04082b-0289-4c3b-9f8d-4c5ac4707163","Type":"ContainerDied","Data":"dd2327f9a6e9c070f43a62c835fa37583588af9b9ee6106912e342d3649f1dae"} Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.481844 4892 scope.go:117] "RemoveContainer" containerID="07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.513438 4892 scope.go:117] "RemoveContainer" containerID="d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.548419 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qgr8s"] Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.562962 4892 scope.go:117] "RemoveContainer" containerID="6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.566030 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qgr8s"] Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.636048 4892 scope.go:117] "RemoveContainer" containerID="07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd" Oct 06 12:38:11 crc kubenswrapper[4892]: E1006 12:38:11.636848 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd\": container with ID starting with 07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd not found: ID does not exist" containerID="07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.636889 
4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd"} err="failed to get container status \"07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd\": rpc error: code = NotFound desc = could not find container \"07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd\": container with ID starting with 07c47d83b99f18e43e5bda1e58724535724f62a0f17e627111cb193f9561c8fd not found: ID does not exist" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.636909 4892 scope.go:117] "RemoveContainer" containerID="d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57" Oct 06 12:38:11 crc kubenswrapper[4892]: E1006 12:38:11.637383 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57\": container with ID starting with d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57 not found: ID does not exist" containerID="d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.637441 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57"} err="failed to get container status \"d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57\": rpc error: code = NotFound desc = could not find container \"d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57\": container with ID starting with d78f3f69394d9d2834c2315f2cbe3ec3376535683250259b033fa3ab627cdc57 not found: ID does not exist" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.637475 4892 scope.go:117] "RemoveContainer" containerID="6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94" Oct 06 12:38:11 crc kubenswrapper[4892]: E1006 12:38:11.637959 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94\": container with ID starting with 6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94 not found: ID does not exist" containerID="6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94" Oct 06 12:38:11 crc kubenswrapper[4892]: I1006 12:38:11.637982 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94"} err="failed to get container status \"6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94\": rpc error: code = NotFound desc = could not find container \"6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94\": container with ID starting with 6dcbead332fd9783d341c6d783f5f8e71812fd8bdf250b05e0d7ba4700213b94 not found: ID does not exist" Oct 06 12:38:12 crc kubenswrapper[4892]: I1006 12:38:12.053423 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmjq4"] Oct 06 12:38:12 crc kubenswrapper[4892]: I1006 12:38:12.064125 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-fmjq4"] Oct 06 12:38:12 crc kubenswrapper[4892]: I1006 12:38:12.180052 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812d3ada-a315-4c39-9f56-bb54525a0df2" 
path="/var/lib/kubelet/pods/812d3ada-a315-4c39-9f56-bb54525a0df2/volumes" Oct 06 12:38:12 crc kubenswrapper[4892]: I1006 12:38:12.180812 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" path="/var/lib/kubelet/pods/dd04082b-0289-4c3b-9f8d-4c5ac4707163/volumes" Oct 06 12:38:22 crc kubenswrapper[4892]: I1006 12:38:22.170248 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:38:22 crc kubenswrapper[4892]: E1006 12:38:22.171465 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:38:29 crc kubenswrapper[4892]: I1006 12:38:29.699857 4892 generic.go:334] "Generic (PLEG): container finished" podID="c8b52544-e7f5-4cab-9b11-1bf028d07c61" containerID="92c0ed260ddf45aee04d2fbab8f702fb00b8449f427620e40aa82179d0f89a61" exitCode=0 Oct 06 12:38:29 crc kubenswrapper[4892]: I1006 12:38:29.699929 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" event={"ID":"c8b52544-e7f5-4cab-9b11-1bf028d07c61","Type":"ContainerDied","Data":"92c0ed260ddf45aee04d2fbab8f702fb00b8449f427620e40aa82179d0f89a61"} Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.244747 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.328856 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-inventory\") pod \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.329022 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-ssh-key\") pod \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.329122 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frlvh\" (UniqueName: \"kubernetes.io/projected/c8b52544-e7f5-4cab-9b11-1bf028d07c61-kube-api-access-frlvh\") pod \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\" (UID: \"c8b52544-e7f5-4cab-9b11-1bf028d07c61\") " Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.334903 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b52544-e7f5-4cab-9b11-1bf028d07c61-kube-api-access-frlvh" (OuterVolumeSpecName: "kube-api-access-frlvh") pod "c8b52544-e7f5-4cab-9b11-1bf028d07c61" (UID: "c8b52544-e7f5-4cab-9b11-1bf028d07c61"). InnerVolumeSpecName "kube-api-access-frlvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.356940 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-inventory" (OuterVolumeSpecName: "inventory") pod "c8b52544-e7f5-4cab-9b11-1bf028d07c61" (UID: "c8b52544-e7f5-4cab-9b11-1bf028d07c61"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.380123 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c8b52544-e7f5-4cab-9b11-1bf028d07c61" (UID: "c8b52544-e7f5-4cab-9b11-1bf028d07c61"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.431652 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.431707 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c8b52544-e7f5-4cab-9b11-1bf028d07c61-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.431728 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frlvh\" (UniqueName: \"kubernetes.io/projected/c8b52544-e7f5-4cab-9b11-1bf028d07c61-kube-api-access-frlvh\") on node \"crc\" DevicePath \"\"" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.727435 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" event={"ID":"c8b52544-e7f5-4cab-9b11-1bf028d07c61","Type":"ContainerDied","Data":"e4eb8bbeba9a13d290240bb4a1d7c53232a1265fddab7680b1f397b892c42724"} Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.727541 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-jw9jh" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.727691 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4eb8bbeba9a13d290240bb4a1d7c53232a1265fddab7680b1f397b892c42724" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.843179 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8"] Oct 06 12:38:31 crc kubenswrapper[4892]: E1006 12:38:31.843806 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="extract-content" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.843837 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="extract-content" Oct 06 12:38:31 crc kubenswrapper[4892]: E1006 12:38:31.843868 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="extract-utilities" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.843882 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="extract-utilities" Oct 06 12:38:31 crc kubenswrapper[4892]: E1006 12:38:31.843919 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b52544-e7f5-4cab-9b11-1bf028d07c61" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.843937 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b52544-e7f5-4cab-9b11-1bf028d07c61" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:38:31 crc kubenswrapper[4892]: E1006 12:38:31.843977 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="registry-server" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.843993 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="registry-server" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.844519 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b52544-e7f5-4cab-9b11-1bf028d07c61" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.844555 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd04082b-0289-4c3b-9f8d-4c5ac4707163" containerName="registry-server" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.845740 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.848470 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.848598 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.848639 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.848887 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.861020 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8"] Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.944913 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.945138 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh5zk\" (UniqueName: \"kubernetes.io/projected/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-kube-api-access-lh5zk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:31 crc kubenswrapper[4892]: I1006 12:38:31.945187 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.047184 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh5zk\" (UniqueName: \"kubernetes.io/projected/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-kube-api-access-lh5zk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.047285 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.047366 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" 
(UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.053017 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.053591 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.066944 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh5zk\" (UniqueName: \"kubernetes.io/projected/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-kube-api-access-lh5zk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.188551 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:38:32 crc kubenswrapper[4892]: I1006 12:38:32.800030 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8"] Oct 06 12:38:33 crc kubenswrapper[4892]: I1006 12:38:33.757168 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" event={"ID":"6665486e-c1dd-4b2d-96d8-5dd9140dc21e","Type":"ContainerStarted","Data":"f8aa8c209f091e7620d703a1ebc61240bb05a63a4c6f29e8fe003110d1825a13"} Oct 06 12:38:33 crc kubenswrapper[4892]: I1006 12:38:33.758044 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" event={"ID":"6665486e-c1dd-4b2d-96d8-5dd9140dc21e","Type":"ContainerStarted","Data":"dd37a9cebb42e50edfa701c34c908c7f074c653a4a1d8e912596a22fc6d1a10b"} Oct 06 12:38:33 crc kubenswrapper[4892]: I1006 12:38:33.782944 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" podStartSLOduration=2.247549711 podStartE2EDuration="2.78292745s" podCreationTimestamp="2025-10-06 12:38:31 +0000 UTC" firstStartedPulling="2025-10-06 12:38:32.808807214 +0000 UTC m=+1799.358512989" lastFinishedPulling="2025-10-06 12:38:33.344184963 +0000 UTC m=+1799.893890728" observedRunningTime="2025-10-06 12:38:33.774689691 +0000 UTC m=+1800.324395466" watchObservedRunningTime="2025-10-06 12:38:33.78292745 +0000 UTC m=+1800.332633205" Oct 06 12:38:34 crc kubenswrapper[4892]: I1006 12:38:34.182015 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:38:34 crc kubenswrapper[4892]: E1006 12:38:34.182515 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
Oct 06 12:38:35 crc kubenswrapper[4892]: I1006 12:38:35.073365 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5jh8x"] Oct 06 12:38:35 crc kubenswrapper[4892]: I1006 12:38:35.086984 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5jh8x"] Oct 06 12:38:36 crc kubenswrapper[4892]: I1006 12:38:36.190237 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c8ba56-1bfe-4b22-bc88-7d25ac9113d0" path="/var/lib/kubelet/pods/49c8ba56-1bfe-4b22-bc88-7d25ac9113d0/volumes" Oct 06 12:38:41 crc kubenswrapper[4892]: I1006 12:38:41.038941 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rpnxm"] Oct 06 12:38:41 crc kubenswrapper[4892]: I1006 12:38:41.050343 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rpnxm"] Oct 06 12:38:42 crc kubenswrapper[4892]: I1006 12:38:42.181598 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905c9ab8-2e12-4c06-9a9f-890faab36198" path="/var/lib/kubelet/pods/905c9ab8-2e12-4c06-9a9f-890faab36198/volumes" Oct 06 12:38:49 crc kubenswrapper[4892]: I1006 12:38:49.169508 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:38:49 crc kubenswrapper[4892]: E1006 12:38:49.170351 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:39:01 crc kubenswrapper[4892]: I1006 12:39:01.754434 4892 scope.go:117] "RemoveContainer" containerID="40cb54a72745796db114d6a68e744b96d70ceaf7170abf1fa781b59cf666a7da" Oct 06 12:39:01 crc kubenswrapper[4892]: I1006 12:39:01.820110 4892 scope.go:117] "RemoveContainer" containerID="26250415420fe80d7f2ca52bdfdda08ae762ecbaf0a1346ee402bb75255619bf" Oct 06 12:39:01 crc kubenswrapper[4892]: I1006 12:39:01.900692 4892 scope.go:117] "RemoveContainer" containerID="018459029c3e97dc59ff073e0b9f90cc653ff818e14cfcc4ba96541ca86fb3f2" Oct 06 12:39:02 crc kubenswrapper[4892]: I1006 12:39:02.170072 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:39:02 crc kubenswrapper[4892]: E1006 12:39:02.170504 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:39:16 crc kubenswrapper[4892]: I1006 12:39:16.168785 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:39:16 crc kubenswrapper[4892]: E1006 12:39:16.169533 4892
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:39:19 crc kubenswrapper[4892]: I1006 12:39:19.077104 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-m6vm8"] Oct 06 12:39:19 crc kubenswrapper[4892]: I1006 12:39:19.088156 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-m6vm8"] Oct 06 12:39:20 crc kubenswrapper[4892]: I1006 12:39:20.184120 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f88b14-cdad-4ccd-865b-6f57c82a1a8a" path="/var/lib/kubelet/pods/c0f88b14-cdad-4ccd-865b-6f57c82a1a8a/volumes" Oct 06 12:39:28 crc kubenswrapper[4892]: I1006 12:39:28.169864 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:39:28 crc kubenswrapper[4892]: E1006 12:39:28.170955 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:39:33 crc kubenswrapper[4892]: I1006 12:39:33.472967 4892 generic.go:334] "Generic (PLEG): container finished" podID="6665486e-c1dd-4b2d-96d8-5dd9140dc21e" containerID="f8aa8c209f091e7620d703a1ebc61240bb05a63a4c6f29e8fe003110d1825a13" exitCode=2 Oct 06 12:39:33 crc kubenswrapper[4892]: I1006 12:39:33.473193 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" event={"ID":"6665486e-c1dd-4b2d-96d8-5dd9140dc21e","Type":"ContainerDied","Data":"f8aa8c209f091e7620d703a1ebc61240bb05a63a4c6f29e8fe003110d1825a13"} Oct 06 12:39:34 crc kubenswrapper[4892]: I1006 12:39:34.964215 4892 util.go:48] "No ready sandbox for pod can be found. 
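Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8"

The configure-os job container exited with code 2, so the kubelet now tears the pod down, and the entries that follow replay the earlier mount sequence in reverse: "operationExecutor.UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached" for each of ssh-key, inventory and the projected service-account token. Both directions fall out of a single reconcile pattern: the volume manager compares a desired state (the volumes that admitted pods need) against an actual state (the volumes really mounted) and issues mounts or unmounts for the difference. A simplified sketch of that pattern, with illustrative names rather than kubelet's real types:

    // Desired-state vs. actual-state reconciliation, simplified.
    // Volumes wanted but not mounted get mounted; volumes still
    // mounted after the pod is gone get unmounted.
    package main

    import "fmt"

    func reconcile(desired, actual map[string]bool) {
        for v := range actual {
            if !desired[v] { // pod deleted: unmount, tear down, detach
                fmt.Println("UnmountVolume started for", v)
                delete(actual, v)
            }
        }
        for v := range desired {
            if !actual[v] { // pod admitted: verify attach, then mount
                fmt.Println("MountVolume started for", v)
                actual[v] = true
            }
        }
    }

    func main() {
        actual := map[string]bool{}
        reconcile(map[string]bool{"ssh-key": true, "inventory": true, "kube-api-access-lh5zk": true}, actual)
        reconcile(map[string]bool{}, actual) // mirrors the teardown below
    }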
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.097892 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh5zk\" (UniqueName: \"kubernetes.io/projected/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-kube-api-access-lh5zk\") pod \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.097973 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-ssh-key\") pod \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.098120 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-inventory\") pod \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\" (UID: \"6665486e-c1dd-4b2d-96d8-5dd9140dc21e\") " Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.109681 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-kube-api-access-lh5zk" (OuterVolumeSpecName: "kube-api-access-lh5zk") pod "6665486e-c1dd-4b2d-96d8-5dd9140dc21e" (UID: "6665486e-c1dd-4b2d-96d8-5dd9140dc21e"). InnerVolumeSpecName "kube-api-access-lh5zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.148845 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6665486e-c1dd-4b2d-96d8-5dd9140dc21e" (UID: "6665486e-c1dd-4b2d-96d8-5dd9140dc21e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.150103 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-inventory" (OuterVolumeSpecName: "inventory") pod "6665486e-c1dd-4b2d-96d8-5dd9140dc21e" (UID: "6665486e-c1dd-4b2d-96d8-5dd9140dc21e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.200619 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.200657 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh5zk\" (UniqueName: \"kubernetes.io/projected/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-kube-api-access-lh5zk\") on node \"crc\" DevicePath \"\"" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.200671 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6665486e-c1dd-4b2d-96d8-5dd9140dc21e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.512935 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" event={"ID":"6665486e-c1dd-4b2d-96d8-5dd9140dc21e","Type":"ContainerDied","Data":"dd37a9cebb42e50edfa701c34c908c7f074c653a4a1d8e912596a22fc6d1a10b"} Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.512983 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8" Oct 06 12:39:35 crc kubenswrapper[4892]: I1006 12:39:35.512991 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd37a9cebb42e50edfa701c34c908c7f074c653a4a1d8e912596a22fc6d1a10b" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.044479 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq"] Oct 06 12:39:42 crc kubenswrapper[4892]: E1006 12:39:42.046050 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6665486e-c1dd-4b2d-96d8-5dd9140dc21e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.046087 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6665486e-c1dd-4b2d-96d8-5dd9140dc21e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.046596 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6665486e-c1dd-4b2d-96d8-5dd9140dc21e" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.048194 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.051870 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.052117 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.052368 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.052422 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.061746 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq"] Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.162434 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hb5\" (UniqueName: \"kubernetes.io/projected/57b29956-a9eb-4ca8-b130-a67dfdf2190b-kube-api-access-m2hb5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.162522 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.162704 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.265648 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.265721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hb5\" (UniqueName: \"kubernetes.io/projected/57b29956-a9eb-4ca8-b130-a67dfdf2190b-kube-api-access-m2hb5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.265837 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" 
(UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.277445 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.277462 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.292796 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hb5\" (UniqueName: \"kubernetes.io/projected/57b29956-a9eb-4ca8-b130-a67dfdf2190b-kube-api-access-m2hb5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kstlq\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:42 crc kubenswrapper[4892]: I1006 12:39:42.384462 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:39:43 crc kubenswrapper[4892]: I1006 12:39:43.043068 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq"] Oct 06 12:39:43 crc kubenswrapper[4892]: W1006 12:39:43.052525 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57b29956_a9eb_4ca8_b130_a67dfdf2190b.slice/crio-2212a70aba0de070a72bf2012b214c7080a36b0edf86faea7823f461adfb2d97 WatchSource:0}: Error finding container 2212a70aba0de070a72bf2012b214c7080a36b0edf86faea7823f461adfb2d97: Status 404 returned error can't find the container with id 2212a70aba0de070a72bf2012b214c7080a36b0edf86faea7823f461adfb2d97 Oct 06 12:39:43 crc kubenswrapper[4892]: I1006 12:39:43.057267 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:39:43 crc kubenswrapper[4892]: I1006 12:39:43.168647 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:39:43 crc kubenswrapper[4892]: E1006 12:39:43.168967 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:39:43 crc kubenswrapper[4892]: I1006 12:39:43.623470 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" event={"ID":"57b29956-a9eb-4ca8-b130-a67dfdf2190b","Type":"ContainerStarted","Data":"2212a70aba0de070a72bf2012b214c7080a36b0edf86faea7823f461adfb2d97"} Oct 06 12:39:44 crc 
kubenswrapper[4892]: I1006 12:39:44.640362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" event={"ID":"57b29956-a9eb-4ca8-b130-a67dfdf2190b","Type":"ContainerStarted","Data":"45580ed53d2af2c71347f5a8388aa179b39a98331dc997f9f29e4c801de8f16a"} Oct 06 12:39:44 crc kubenswrapper[4892]: I1006 12:39:44.679910 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" podStartSLOduration=2.025163636 podStartE2EDuration="2.679880875s" podCreationTimestamp="2025-10-06 12:39:42 +0000 UTC" firstStartedPulling="2025-10-06 12:39:43.05689582 +0000 UTC m=+1869.606601585" lastFinishedPulling="2025-10-06 12:39:43.711613029 +0000 UTC m=+1870.261318824" observedRunningTime="2025-10-06 12:39:44.661363808 +0000 UTC m=+1871.211069603" watchObservedRunningTime="2025-10-06 12:39:44.679880875 +0000 UTC m=+1871.229586670" Oct 06 12:39:58 crc kubenswrapper[4892]: I1006 12:39:58.174703 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:39:58 crc kubenswrapper[4892]: E1006 12:39:58.175352 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:40:02 crc kubenswrapper[4892]: I1006 12:40:02.045812 4892 scope.go:117] "RemoveContainer" containerID="204462ecb6e2fd9f11f1f836a1565b41dc9cba1a625f7c36cea3959dde9a8780" Oct 06 12:40:11 crc kubenswrapper[4892]: I1006 12:40:11.168857 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:40:11 crc kubenswrapper[4892]: E1006 12:40:11.170007 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:40:25 crc kubenswrapper[4892]: I1006 12:40:25.168656 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:40:25 crc kubenswrapper[4892]: E1006 12:40:25.171104 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:40:38 crc kubenswrapper[4892]: I1006 12:40:38.230438 4892 generic.go:334] "Generic (PLEG): container finished" podID="57b29956-a9eb-4ca8-b130-a67dfdf2190b" containerID="45580ed53d2af2c71347f5a8388aa179b39a98331dc997f9f29e4c801de8f16a" exitCode=0 Oct 06 12:40:38 crc kubenswrapper[4892]: I1006 12:40:38.230484 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
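pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" event={"ID":"57b29956-a9eb-4ca8-b130-a67dfdf2190b","Type":"ContainerDied","Data":"45580ed53d2af2c71347f5a8388aa179b39a98331dc997f9f29e4c801de8f16a"}

The "Observed pod startup duration" line above decomposes cleanly: podStartE2EDuration (2.679880875s) is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (2.025163636) appears to be that same interval minus the image-pull window, with the pull window taken from the monotonic m=+... offsets rather than the wall-clock stamps. That reading is an inference from the logged numbers, not a quote of kubelet's source; it can be checked directly:

    // Re-deriving the startup metrics for
    // configure-os-edpm-deployment-openstack-edpm-ipam-kstlq
    // from the values in the log line above.
    package main

    import "fmt"

    func main() {
        created := 42.0             // podCreationTimestamp 12:39:42
        running := 44.679880875     // watchObservedRunningTime 12:39:44.679880875
        pullStart := 1869.606601585 // m=+... at firstStartedPulling
        pullEnd := 1870.261318824   // m=+... at lastFinishedPulling

        e2e := running - created
        slo := e2e - (pullEnd - pullStart)
        fmt.Printf("E2E=%.9fs SLO=%.9fs\n", e2e, slo)
        // E2E=2.679880875s SLO=2.025163636s, matching
        // podStartE2EDuration and podStartSLOduration up to
        // float64 rounding.
    }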
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" event={"ID":"57b29956-a9eb-4ca8-b130-a67dfdf2190b","Type":"ContainerDied","Data":"45580ed53d2af2c71347f5a8388aa179b39a98331dc997f9f29e4c801de8f16a"} Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.168269 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:40:39 crc kubenswrapper[4892]: E1006 12:40:39.168643 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.732879 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.882543 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-ssh-key\") pod \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.882883 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2hb5\" (UniqueName: \"kubernetes.io/projected/57b29956-a9eb-4ca8-b130-a67dfdf2190b-kube-api-access-m2hb5\") pod \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.883019 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-inventory\") pod \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\" (UID: \"57b29956-a9eb-4ca8-b130-a67dfdf2190b\") " Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.893702 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b29956-a9eb-4ca8-b130-a67dfdf2190b-kube-api-access-m2hb5" (OuterVolumeSpecName: "kube-api-access-m2hb5") pod "57b29956-a9eb-4ca8-b130-a67dfdf2190b" (UID: "57b29956-a9eb-4ca8-b130-a67dfdf2190b"). InnerVolumeSpecName "kube-api-access-m2hb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.915833 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "57b29956-a9eb-4ca8-b130-a67dfdf2190b" (UID: "57b29956-a9eb-4ca8-b130-a67dfdf2190b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.933291 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-inventory" (OuterVolumeSpecName: "inventory") pod "57b29956-a9eb-4ca8-b130-a67dfdf2190b" (UID: "57b29956-a9eb-4ca8-b130-a67dfdf2190b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.985124 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.985160 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/57b29956-a9eb-4ca8-b130-a67dfdf2190b-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:40:39 crc kubenswrapper[4892]: I1006 12:40:39.985172 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2hb5\" (UniqueName: \"kubernetes.io/projected/57b29956-a9eb-4ca8-b130-a67dfdf2190b-kube-api-access-m2hb5\") on node \"crc\" DevicePath \"\"" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.271211 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" event={"ID":"57b29956-a9eb-4ca8-b130-a67dfdf2190b","Type":"ContainerDied","Data":"2212a70aba0de070a72bf2012b214c7080a36b0edf86faea7823f461adfb2d97"} Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.271264 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2212a70aba0de070a72bf2012b214c7080a36b0edf86faea7823f461adfb2d97" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.271356 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kstlq" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.362282 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7wmh"] Oct 06 12:40:40 crc kubenswrapper[4892]: E1006 12:40:40.362741 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57b29956-a9eb-4ca8-b130-a67dfdf2190b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.362766 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b29956-a9eb-4ca8-b130-a67dfdf2190b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.363046 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="57b29956-a9eb-4ca8-b130-a67dfdf2190b" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.363897 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.366523 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.366598 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.366970 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.369718 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.372054 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7wmh"] Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.497534 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/cac6d064-9f38-40e2-aa4a-ad2af08245c3-kube-api-access-qkxbn\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.497612 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.497695 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.599536 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/cac6d064-9f38-40e2-aa4a-ad2af08245c3-kube-api-access-qkxbn\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.599608 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.599675 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc 
kubenswrapper[4892]: I1006 12:40:40.603798 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.605229 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.615805 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/cac6d064-9f38-40e2-aa4a-ad2af08245c3-kube-api-access-qkxbn\") pod \"ssh-known-hosts-edpm-deployment-b7wmh\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:40 crc kubenswrapper[4892]: I1006 12:40:40.684413 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:41 crc kubenswrapper[4892]: I1006 12:40:41.221882 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-b7wmh"] Oct 06 12:40:41 crc kubenswrapper[4892]: W1006 12:40:41.225698 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac6d064_9f38_40e2_aa4a_ad2af08245c3.slice/crio-b912f58f2d9377002fabb635d0b68ee209b856177e06a743023532dd58f5911a WatchSource:0}: Error finding container b912f58f2d9377002fabb635d0b68ee209b856177e06a743023532dd58f5911a: Status 404 returned error can't find the container with id b912f58f2d9377002fabb635d0b68ee209b856177e06a743023532dd58f5911a Oct 06 12:40:41 crc kubenswrapper[4892]: I1006 12:40:41.279989 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" event={"ID":"cac6d064-9f38-40e2-aa4a-ad2af08245c3","Type":"ContainerStarted","Data":"b912f58f2d9377002fabb635d0b68ee209b856177e06a743023532dd58f5911a"} Oct 06 12:40:42 crc kubenswrapper[4892]: I1006 12:40:42.292898 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" event={"ID":"cac6d064-9f38-40e2-aa4a-ad2af08245c3","Type":"ContainerStarted","Data":"cf2447f8db77c2a937c7727db3c5cc28f0a6767253985bd6d77faca0471b6bba"} Oct 06 12:40:42 crc kubenswrapper[4892]: I1006 12:40:42.336380 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" podStartSLOduration=1.680869504 podStartE2EDuration="2.336317642s" podCreationTimestamp="2025-10-06 12:40:40 +0000 UTC" firstStartedPulling="2025-10-06 12:40:41.227314988 +0000 UTC m=+1927.777020753" lastFinishedPulling="2025-10-06 12:40:41.882763096 +0000 UTC m=+1928.432468891" observedRunningTime="2025-10-06 12:40:42.317743245 +0000 UTC m=+1928.867449030" watchObservedRunningTime="2025-10-06 12:40:42.336317642 +0000 UTC m=+1928.886023437" Oct 06 12:40:49 crc kubenswrapper[4892]: E1006 12:40:49.731954 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcac6d064_9f38_40e2_aa4a_ad2af08245c3.slice/crio-conmon-cf2447f8db77c2a937c7727db3c5cc28f0a6767253985bd6d77faca0471b6bba.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:40:50 crc kubenswrapper[4892]: I1006 12:40:50.391888 4892 generic.go:334] "Generic (PLEG): container finished" podID="cac6d064-9f38-40e2-aa4a-ad2af08245c3" containerID="cf2447f8db77c2a937c7727db3c5cc28f0a6767253985bd6d77faca0471b6bba" exitCode=0 Oct 06 12:40:50 crc kubenswrapper[4892]: I1006 12:40:50.391961 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" event={"ID":"cac6d064-9f38-40e2-aa4a-ad2af08245c3","Type":"ContainerDied","Data":"cf2447f8db77c2a937c7727db3c5cc28f0a6767253985bd6d77faca0471b6bba"} Oct 06 12:40:51 crc kubenswrapper[4892]: I1006 12:40:51.169120 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:40:51 crc kubenswrapper[4892]: E1006 12:40:51.169534 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:40:51 crc kubenswrapper[4892]: I1006 12:40:51.892606 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:51 crc kubenswrapper[4892]: I1006 12:40:51.964123 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-ssh-key-openstack-edpm-ipam\") pod \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " Oct 06 12:40:51 crc kubenswrapper[4892]: I1006 12:40:51.964193 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-inventory-0\") pod \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " Oct 06 12:40:51 crc kubenswrapper[4892]: I1006 12:40:51.964362 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/cac6d064-9f38-40e2-aa4a-ad2af08245c3-kube-api-access-qkxbn\") pod \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\" (UID: \"cac6d064-9f38-40e2-aa4a-ad2af08245c3\") " Oct 06 12:40:51 crc kubenswrapper[4892]: I1006 12:40:51.970502 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac6d064-9f38-40e2-aa4a-ad2af08245c3-kube-api-access-qkxbn" (OuterVolumeSpecName: "kube-api-access-qkxbn") pod "cac6d064-9f38-40e2-aa4a-ad2af08245c3" (UID: "cac6d064-9f38-40e2-aa4a-ad2af08245c3"). InnerVolumeSpecName "kube-api-access-qkxbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:40:51 crc kubenswrapper[4892]: I1006 12:40:51.996227 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "cac6d064-9f38-40e2-aa4a-ad2af08245c3" (UID: "cac6d064-9f38-40e2-aa4a-ad2af08245c3"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.004927 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cac6d064-9f38-40e2-aa4a-ad2af08245c3" (UID: "cac6d064-9f38-40e2-aa4a-ad2af08245c3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.067620 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.067654 4892 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/cac6d064-9f38-40e2-aa4a-ad2af08245c3-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.067668 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkxbn\" (UniqueName: \"kubernetes.io/projected/cac6d064-9f38-40e2-aa4a-ad2af08245c3-kube-api-access-qkxbn\") on node \"crc\" DevicePath \"\"" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.416838 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" event={"ID":"cac6d064-9f38-40e2-aa4a-ad2af08245c3","Type":"ContainerDied","Data":"b912f58f2d9377002fabb635d0b68ee209b856177e06a743023532dd58f5911a"} Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.416904 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b912f58f2d9377002fabb635d0b68ee209b856177e06a743023532dd58f5911a" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.416946 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-b7wmh" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.523267 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8"] Oct 06 12:40:52 crc kubenswrapper[4892]: E1006 12:40:52.523734 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac6d064-9f38-40e2-aa4a-ad2af08245c3" containerName="ssh-known-hosts-edpm-deployment" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.523751 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac6d064-9f38-40e2-aa4a-ad2af08245c3" containerName="ssh-known-hosts-edpm-deployment" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.523958 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac6d064-9f38-40e2-aa4a-ad2af08245c3" containerName="ssh-known-hosts-edpm-deployment" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.524755 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.527392 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.527847 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.527878 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.528219 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.553172 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8"] Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.680794 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.680975 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.681025 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pbvs\" (UniqueName: \"kubernetes.io/projected/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-kube-api-access-2pbvs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.782635 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.782732 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.782755 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pbvs\" (UniqueName: \"kubernetes.io/projected/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-kube-api-access-2pbvs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.786839 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.787912 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.807494 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pbvs\" (UniqueName: \"kubernetes.io/projected/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-kube-api-access-2pbvs\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fksz8\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:52 crc kubenswrapper[4892]: I1006 12:40:52.851054 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:40:53 crc kubenswrapper[4892]: I1006 12:40:53.419432 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8"] Oct 06 12:40:54 crc kubenswrapper[4892]: I1006 12:40:54.439619 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" event={"ID":"ecf9ad4c-db82-467f-9fae-bc74b2e7c912","Type":"ContainerStarted","Data":"3ade15c0a4ce62498f193a52a80f93662287a1236da5b5c6f688d2f034846c6b"} Oct 06 12:40:54 crc kubenswrapper[4892]: I1006 12:40:54.439938 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" event={"ID":"ecf9ad4c-db82-467f-9fae-bc74b2e7c912","Type":"ContainerStarted","Data":"14495ddc061c30edd5fe7c12033b28180e5a6838444617fa93359ecd8f477f3b"} Oct 06 12:40:54 crc kubenswrapper[4892]: I1006 12:40:54.465469 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" podStartSLOduration=2.021397191 podStartE2EDuration="2.465446702s" podCreationTimestamp="2025-10-06 12:40:52 +0000 UTC" firstStartedPulling="2025-10-06 12:40:53.433665448 +0000 UTC m=+1939.983371253" lastFinishedPulling="2025-10-06 12:40:53.877714959 +0000 UTC m=+1940.427420764" observedRunningTime="2025-10-06 12:40:54.461998952 +0000 UTC m=+1941.011704717" watchObservedRunningTime="2025-10-06 12:40:54.465446702 +0000 UTC m=+1941.015152467" Oct 06 12:41:03 crc kubenswrapper[4892]: I1006 12:41:03.554975 4892 generic.go:334] "Generic (PLEG): container finished" podID="ecf9ad4c-db82-467f-9fae-bc74b2e7c912" containerID="3ade15c0a4ce62498f193a52a80f93662287a1236da5b5c6f688d2f034846c6b" exitCode=0 Oct 06 12:41:03 crc kubenswrapper[4892]: I1006 12:41:03.555093 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" 
event={"ID":"ecf9ad4c-db82-467f-9fae-bc74b2e7c912","Type":"ContainerDied","Data":"3ade15c0a4ce62498f193a52a80f93662287a1236da5b5c6f688d2f034846c6b"} Oct 06 12:41:04 crc kubenswrapper[4892]: I1006 12:41:04.999314 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.151463 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pbvs\" (UniqueName: \"kubernetes.io/projected/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-kube-api-access-2pbvs\") pod \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.151645 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-ssh-key\") pod \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.151716 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-inventory\") pod \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\" (UID: \"ecf9ad4c-db82-467f-9fae-bc74b2e7c912\") " Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.168498 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-kube-api-access-2pbvs" (OuterVolumeSpecName: "kube-api-access-2pbvs") pod "ecf9ad4c-db82-467f-9fae-bc74b2e7c912" (UID: "ecf9ad4c-db82-467f-9fae-bc74b2e7c912"). InnerVolumeSpecName "kube-api-access-2pbvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.170297 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:41:05 crc kubenswrapper[4892]: E1006 12:41:05.170668 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.194984 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-inventory" (OuterVolumeSpecName: "inventory") pod "ecf9ad4c-db82-467f-9fae-bc74b2e7c912" (UID: "ecf9ad4c-db82-467f-9fae-bc74b2e7c912"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.202513 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ecf9ad4c-db82-467f-9fae-bc74b2e7c912" (UID: "ecf9ad4c-db82-467f-9fae-bc74b2e7c912"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.254026 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pbvs\" (UniqueName: \"kubernetes.io/projected/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-kube-api-access-2pbvs\") on node \"crc\" DevicePath \"\"" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.254071 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.254088 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecf9ad4c-db82-467f-9fae-bc74b2e7c912-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.588563 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" event={"ID":"ecf9ad4c-db82-467f-9fae-bc74b2e7c912","Type":"ContainerDied","Data":"14495ddc061c30edd5fe7c12033b28180e5a6838444617fa93359ecd8f477f3b"} Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.588617 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14495ddc061c30edd5fe7c12033b28180e5a6838444617fa93359ecd8f477f3b" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.588782 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fksz8" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.661665 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c"] Oct 06 12:41:05 crc kubenswrapper[4892]: E1006 12:41:05.662227 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf9ad4c-db82-467f-9fae-bc74b2e7c912" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.662258 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf9ad4c-db82-467f-9fae-bc74b2e7c912" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.662678 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf9ad4c-db82-467f-9fae-bc74b2e7c912" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.663669 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.666458 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.666460 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.667431 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.669774 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.675431 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c"] Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.766229 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcrqh\" (UniqueName: \"kubernetes.io/projected/7ae373d0-3872-4bdb-ab95-af7aba741dbc-kube-api-access-mcrqh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.766276 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.766459 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.867925 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.868335 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcrqh\" (UniqueName: \"kubernetes.io/projected/7ae373d0-3872-4bdb-ab95-af7aba741dbc-kube-api-access-mcrqh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.868361 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: 
\"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.872898 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.886868 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:05 crc kubenswrapper[4892]: I1006 12:41:05.896360 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcrqh\" (UniqueName: \"kubernetes.io/projected/7ae373d0-3872-4bdb-ab95-af7aba741dbc-kube-api-access-mcrqh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:06 crc kubenswrapper[4892]: I1006 12:41:06.026532 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:06 crc kubenswrapper[4892]: I1006 12:41:06.546675 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c"] Oct 06 12:41:06 crc kubenswrapper[4892]: I1006 12:41:06.600966 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" event={"ID":"7ae373d0-3872-4bdb-ab95-af7aba741dbc","Type":"ContainerStarted","Data":"3a99b567043249057aeaeb135870b6917ab6de09662964b48fa3fb2373f45acf"} Oct 06 12:41:07 crc kubenswrapper[4892]: I1006 12:41:07.632673 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" event={"ID":"7ae373d0-3872-4bdb-ab95-af7aba741dbc","Type":"ContainerStarted","Data":"7d090dd8d95c54e68e5bcfd0213ca94d2240200361adbb7732971e74d2f679be"} Oct 06 12:41:07 crc kubenswrapper[4892]: I1006 12:41:07.668172 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" podStartSLOduration=2.165189139 podStartE2EDuration="2.668144681s" podCreationTimestamp="2025-10-06 12:41:05 +0000 UTC" firstStartedPulling="2025-10-06 12:41:06.561157545 +0000 UTC m=+1953.110863320" lastFinishedPulling="2025-10-06 12:41:07.064113097 +0000 UTC m=+1953.613818862" observedRunningTime="2025-10-06 12:41:07.650887362 +0000 UTC m=+1954.200593167" watchObservedRunningTime="2025-10-06 12:41:07.668144681 +0000 UTC m=+1954.217850486" Oct 06 12:41:17 crc kubenswrapper[4892]: I1006 12:41:17.740447 4892 generic.go:334] "Generic (PLEG): container finished" podID="7ae373d0-3872-4bdb-ab95-af7aba741dbc" containerID="7d090dd8d95c54e68e5bcfd0213ca94d2240200361adbb7732971e74d2f679be" exitCode=0 Oct 06 12:41:17 crc kubenswrapper[4892]: I1006 12:41:17.740611 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" 
event={"ID":"7ae373d0-3872-4bdb-ab95-af7aba741dbc","Type":"ContainerDied","Data":"7d090dd8d95c54e68e5bcfd0213ca94d2240200361adbb7732971e74d2f679be"} Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.259395 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.440530 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcrqh\" (UniqueName: \"kubernetes.io/projected/7ae373d0-3872-4bdb-ab95-af7aba741dbc-kube-api-access-mcrqh\") pod \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.441429 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-ssh-key\") pod \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.442009 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-inventory\") pod \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\" (UID: \"7ae373d0-3872-4bdb-ab95-af7aba741dbc\") " Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.446926 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae373d0-3872-4bdb-ab95-af7aba741dbc-kube-api-access-mcrqh" (OuterVolumeSpecName: "kube-api-access-mcrqh") pod "7ae373d0-3872-4bdb-ab95-af7aba741dbc" (UID: "7ae373d0-3872-4bdb-ab95-af7aba741dbc"). InnerVolumeSpecName "kube-api-access-mcrqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.467904 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-inventory" (OuterVolumeSpecName: "inventory") pod "7ae373d0-3872-4bdb-ab95-af7aba741dbc" (UID: "7ae373d0-3872-4bdb-ab95-af7aba741dbc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.469436 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7ae373d0-3872-4bdb-ab95-af7aba741dbc" (UID: "7ae373d0-3872-4bdb-ab95-af7aba741dbc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.544817 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcrqh\" (UniqueName: \"kubernetes.io/projected/7ae373d0-3872-4bdb-ab95-af7aba741dbc-kube-api-access-mcrqh\") on node \"crc\" DevicePath \"\"" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.544866 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.544884 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ae373d0-3872-4bdb-ab95-af7aba741dbc-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.773304 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" event={"ID":"7ae373d0-3872-4bdb-ab95-af7aba741dbc","Type":"ContainerDied","Data":"3a99b567043249057aeaeb135870b6917ab6de09662964b48fa3fb2373f45acf"} Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.773369 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a99b567043249057aeaeb135870b6917ab6de09662964b48fa3fb2373f45acf" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.773864 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.874558 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp"] Oct 06 12:41:19 crc kubenswrapper[4892]: E1006 12:41:19.875033 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae373d0-3872-4bdb-ab95-af7aba741dbc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.875061 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae373d0-3872-4bdb-ab95-af7aba741dbc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.875361 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae373d0-3872-4bdb-ab95-af7aba741dbc" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.876568 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.879703 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.879866 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.880239 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.880663 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.886459 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.887595 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.887841 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.888018 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 06 12:41:19 crc kubenswrapper[4892]: I1006 12:41:19.903207 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp"] Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.054657 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.054733 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055348 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055411 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055522 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055552 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055591 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055621 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7tjz\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-kube-api-access-t7tjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055723 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055794 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055877 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: 
\"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.055924 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.056021 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.056139 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157682 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157763 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157787 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157817 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157840 4892 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157877 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157905 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7tjz\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-kube-api-access-t7tjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157935 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157962 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.157997 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.158016 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.158049 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.158085 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.158119 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.163267 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.163975 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.164420 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.164647 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.164803 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.165401 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.165987 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.166030 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.166261 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.166488 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.167155 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.167433 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.168727 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:41:20 crc kubenswrapper[4892]: E1006 12:41:20.169173 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.171248 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.180840 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7tjz\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-kube-api-access-t7tjz\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-zzphp\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.201175 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.547006 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp"] Oct 06 12:41:20 crc kubenswrapper[4892]: I1006 12:41:20.786307 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" event={"ID":"68286c2d-3cd6-43fd-943d-b2156e1253a5","Type":"ContainerStarted","Data":"78efcae317cf1f89f14bb36eddd7b35f33ac9efff54942f75d51336cea3fe01a"} Oct 06 12:41:21 crc kubenswrapper[4892]: I1006 12:41:21.799151 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" event={"ID":"68286c2d-3cd6-43fd-943d-b2156e1253a5","Type":"ContainerStarted","Data":"6394f9d789abb83dd975be121e70b5ba8b8db290882dd5f3d2a074be7d895466"} Oct 06 12:41:21 crc kubenswrapper[4892]: I1006 12:41:21.832253 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" podStartSLOduration=2.357747231 podStartE2EDuration="2.83223688s" podCreationTimestamp="2025-10-06 12:41:19 +0000 UTC" firstStartedPulling="2025-10-06 12:41:20.573006276 +0000 UTC m=+1967.122712041" lastFinishedPulling="2025-10-06 12:41:21.047495915 +0000 UTC m=+1967.597201690" observedRunningTime="2025-10-06 12:41:21.825826285 +0000 UTC m=+1968.375532040" watchObservedRunningTime="2025-10-06 12:41:21.83223688 +0000 UTC m=+1968.381942645" Oct 06 12:41:33 crc kubenswrapper[4892]: I1006 12:41:33.169188 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:41:33 crc kubenswrapper[4892]: E1006 12:41:33.170456 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
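The pod_startup_latency_tracker entry above reports two durations for install-certs-edpm-deployment-openstack-edpm-ipam-zzphp: podStartE2EDuration (2.83223688s) equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (2.357747231) matches that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling, taken on the monotonic "m=+" clock). A minimal sketch of the arithmetic, using only numbers from the log line; the subtraction relationship is inferred from these values, not quoted from kubelet source:

```go
package main

import "fmt"

// Reproduces the arithmetic behind the "Observed pod startup duration" line
// above. The m=+ offsets are monotonic-clock seconds; the SLO duration is
// assumed (from the values themselves) to be the end-to-end startup time
// with the image-pull window subtracted.
func main() {
	const (
		e2eDuration         = 2.83223688     // podStartE2EDuration: watchObservedRunningTime - podCreationTimestamp
		firstStartedPulling = 1967.122712041 // m=+ offset, seconds
		lastFinishedPulling = 1967.597201690 // m=+ offset, seconds
	)
	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window:   %.9fs\n", pull)            // 0.474489649s
	fmt.Printf("podStartSLOduration: %.9f\n", e2eDuration-pull) // 2.357747231, matching the log
}
```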
podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:41:47 crc kubenswrapper[4892]: I1006 12:41:47.171261 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:41:47 crc kubenswrapper[4892]: E1006 12:41:47.172675 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:42:01 crc kubenswrapper[4892]: I1006 12:42:01.169210 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:42:02 crc kubenswrapper[4892]: I1006 12:42:02.285088 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"aaf00a7af53f2aef2fe68e84f3f54bc362b13cdfe9c13a663cf8163c9b021ad5"} Oct 06 12:42:05 crc kubenswrapper[4892]: I1006 12:42:05.314151 4892 generic.go:334] "Generic (PLEG): container finished" podID="68286c2d-3cd6-43fd-943d-b2156e1253a5" containerID="6394f9d789abb83dd975be121e70b5ba8b8db290882dd5f3d2a074be7d895466" exitCode=0 Oct 06 12:42:05 crc kubenswrapper[4892]: I1006 12:42:05.314230 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" event={"ID":"68286c2d-3cd6-43fd-943d-b2156e1253a5","Type":"ContainerDied","Data":"6394f9d789abb83dd975be121e70b5ba8b8db290882dd5f3d2a074be7d895466"} Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.815636 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890594 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-telemetry-combined-ca-bundle\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890651 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-bootstrap-combined-ca-bundle\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890689 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-repo-setup-combined-ca-bundle\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890728 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-neutron-metadata-combined-ca-bundle\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890803 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890819 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-nova-combined-ca-bundle\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890859 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890884 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-inventory\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890905 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-libvirt-combined-ca-bundle\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc
kubenswrapper[4892]: I1006 12:42:06.890951 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ovn-combined-ca-bundle\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.890988 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.891011 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ssh-key\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.891033 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7tjz\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-kube-api-access-t7tjz\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.891072 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"68286c2d-3cd6-43fd-943d-b2156e1253a5\" (UID: \"68286c2d-3cd6-43fd-943d-b2156e1253a5\") " Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.897345 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.897396 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.897503 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.898332 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.899161 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.899436 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.900067 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.909581 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-kube-api-access-t7tjz" (OuterVolumeSpecName: "kube-api-access-t7tjz") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "kube-api-access-t7tjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.909610 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.909668 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.909763 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.911538 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.924774 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-inventory" (OuterVolumeSpecName: "inventory") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.924787 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "68286c2d-3cd6-43fd-943d-b2156e1253a5" (UID: "68286c2d-3cd6-43fd-943d-b2156e1253a5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993200 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7tjz\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-kube-api-access-t7tjz\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993243 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993258 4892 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993271 4892 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993283 4892 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993294 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993308 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993337 4892 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993350 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993362 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993374 4892 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993388 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993400 4892 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/68286c2d-3cd6-43fd-943d-b2156e1253a5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:06 crc kubenswrapper[4892]: I1006 12:42:06.993412 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/68286c2d-3cd6-43fd-943d-b2156e1253a5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.340939 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" event={"ID":"68286c2d-3cd6-43fd-943d-b2156e1253a5","Type":"ContainerDied","Data":"78efcae317cf1f89f14bb36eddd7b35f33ac9efff54942f75d51336cea3fe01a"} Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.341298 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78efcae317cf1f89f14bb36eddd7b35f33ac9efff54942f75d51336cea3fe01a" Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.341009 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-zzphp" Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.456828 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bbhwk"] Oct 06 12:42:07 crc kubenswrapper[4892]: E1006 12:42:07.457190 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68286c2d-3cd6-43fd-943d-b2156e1253a5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.457207 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="68286c2d-3cd6-43fd-943d-b2156e1253a5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.457434 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="68286c2d-3cd6-43fd-943d-b2156e1253a5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.458966 4892 util.go:30] "No sandbox for pod can be found. 
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.483889 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbhwk"]
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.505768 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzmw\" (UniqueName: \"kubernetes.io/projected/a21bb194-d03c-4eba-bf1f-ac216685a432-kube-api-access-pnzmw\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.505865 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-catalog-content\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.505909 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-utilities\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.563729 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"]
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.565020 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.567737 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.567965 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.567998 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.568104 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.568177 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.575885 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"]
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608076 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608116 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608188 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608300 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9b2\" (UniqueName: \"kubernetes.io/projected/073e303e-602b-4cce-b0c5-f9da295a63a4-kube-api-access-rt9b2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608352 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/073e303e-602b-4cce-b0c5-f9da295a63a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608566 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzmw\" (UniqueName: \"kubernetes.io/projected/a21bb194-d03c-4eba-bf1f-ac216685a432-kube-api-access-pnzmw\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608778 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-catalog-content\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.608866 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-utilities\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.609349 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-utilities\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.609313 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-catalog-content\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.630391 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzmw\" (UniqueName: \"kubernetes.io/projected/a21bb194-d03c-4eba-bf1f-ac216685a432-kube-api-access-pnzmw\") pod \"redhat-operators-bbhwk\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") " pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.710766 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.710809 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.710843 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.710911 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt9b2\" (UniqueName: \"kubernetes.io/projected/073e303e-602b-4cce-b0c5-f9da295a63a4-kube-api-access-rt9b2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.710949 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/073e303e-602b-4cce-b0c5-f9da295a63a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.716241 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/073e303e-602b-4cce-b0c5-f9da295a63a4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.718628 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.720195 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.735044 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.739121 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt9b2\" (UniqueName: \"kubernetes.io/projected/073e303e-602b-4cce-b0c5-f9da295a63a4-kube-api-access-rt9b2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-zwdjq\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.780079 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:07 crc kubenswrapper[4892]: I1006 12:42:07.896519 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:42:08 crc kubenswrapper[4892]: I1006 12:42:08.239047 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bbhwk"]
Oct 06 12:42:08 crc kubenswrapper[4892]: I1006 12:42:08.365055 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbhwk" event={"ID":"a21bb194-d03c-4eba-bf1f-ac216685a432","Type":"ContainerStarted","Data":"2331830ff2ed8f008ddc52fa6e47d91fceacd7e21016f5e6e72ad93181d074e6"}
Oct 06 12:42:08 crc kubenswrapper[4892]: I1006 12:42:08.485385 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"]
Oct 06 12:42:09 crc kubenswrapper[4892]: I1006 12:42:09.381680 4892 generic.go:334] "Generic (PLEG): container finished" podID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerID="c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef" exitCode=0
Oct 06 12:42:09 crc kubenswrapper[4892]: I1006 12:42:09.381848 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbhwk" event={"ID":"a21bb194-d03c-4eba-bf1f-ac216685a432","Type":"ContainerDied","Data":"c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef"}
Oct 06 12:42:09 crc kubenswrapper[4892]: I1006 12:42:09.383818 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq" event={"ID":"073e303e-602b-4cce-b0c5-f9da295a63a4","Type":"ContainerStarted","Data":"99bb33f9dccadf80261aaeb3f6f29dd6e8bd715b70a56e184b8510b26d28563b"}
Oct 06 12:42:09 crc kubenswrapper[4892]: I1006 12:42:09.383858 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq" event={"ID":"073e303e-602b-4cce-b0c5-f9da295a63a4","Type":"ContainerStarted","Data":"e15cae6a5403dbb71f6d56abe5e69e0fae624c34fca132b7313fe89982f27144"}
Oct 06 12:42:09 crc kubenswrapper[4892]: I1006 12:42:09.420734 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq" podStartSLOduration=2.006200338 podStartE2EDuration="2.420713605s" podCreationTimestamp="2025-10-06 12:42:07 +0000 UTC" firstStartedPulling="2025-10-06 12:42:08.540757219 +0000 UTC m=+2015.090462974" lastFinishedPulling="2025-10-06 12:42:08.955270486 +0000 UTC m=+2015.504976241" observedRunningTime="2025-10-06 12:42:09.410700096 +0000 UTC m=+2015.960405861" watchObservedRunningTime="2025-10-06 12:42:09.420713605 +0000 UTC m=+2015.970419380"
Oct 06 12:42:11 crc kubenswrapper[4892]: I1006 12:42:11.411131 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbhwk" event={"ID":"a21bb194-d03c-4eba-bf1f-ac216685a432","Type":"ContainerStarted","Data":"7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b"}
Oct 06 12:42:12 crc kubenswrapper[4892]: I1006 12:42:12.424491 4892 generic.go:334] "Generic (PLEG): container finished" podID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerID="7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b" exitCode=0
Oct 06 12:42:12 crc kubenswrapper[4892]: I1006 12:42:12.424614 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbhwk" event={"ID":"a21bb194-d03c-4eba-bf1f-ac216685a432","Type":"ContainerDied","Data":"7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b"}
Oct 06 12:42:13 crc kubenswrapper[4892]: I1006 12:42:13.440900 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbhwk" event={"ID":"a21bb194-d03c-4eba-bf1f-ac216685a432","Type":"ContainerStarted","Data":"46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1"}
Oct 06 12:42:13 crc kubenswrapper[4892]: I1006 12:42:13.464117 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bbhwk" podStartSLOduration=2.95283088 podStartE2EDuration="6.464098658s" podCreationTimestamp="2025-10-06 12:42:07 +0000 UTC" firstStartedPulling="2025-10-06 12:42:09.387051982 +0000 UTC m=+2015.936757767" lastFinishedPulling="2025-10-06 12:42:12.89831978 +0000 UTC m=+2019.448025545" observedRunningTime="2025-10-06 12:42:13.462751139 +0000 UTC m=+2020.012456914" watchObservedRunningTime="2025-10-06 12:42:13.464098658 +0000 UTC m=+2020.013804443"
Oct 06 12:42:17 crc kubenswrapper[4892]: I1006 12:42:17.780980 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:17 crc kubenswrapper[4892]: I1006 12:42:17.781710 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:18 crc kubenswrapper[4892]: I1006 12:42:18.835975 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bbhwk" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="registry-server" probeResult="failure" output=<
Oct 06 12:42:18 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s
Oct 06 12:42:18 crc kubenswrapper[4892]: >
Oct 06 12:42:27 crc kubenswrapper[4892]: I1006 12:42:27.874599 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:27 crc kubenswrapper[4892]: I1006 12:42:27.963948 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:28 crc kubenswrapper[4892]: I1006 12:42:28.116212 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbhwk"]
Oct 06 12:42:29 crc kubenswrapper[4892]: I1006 12:42:29.639024 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bbhwk" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="registry-server" containerID="cri-o://46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1" gracePeriod=2
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.146772 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.291374 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-utilities\") pod \"a21bb194-d03c-4eba-bf1f-ac216685a432\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") "
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.291621 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-catalog-content\") pod \"a21bb194-d03c-4eba-bf1f-ac216685a432\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") "
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.291887 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnzmw\" (UniqueName: \"kubernetes.io/projected/a21bb194-d03c-4eba-bf1f-ac216685a432-kube-api-access-pnzmw\") pod \"a21bb194-d03c-4eba-bf1f-ac216685a432\" (UID: \"a21bb194-d03c-4eba-bf1f-ac216685a432\") "
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.292431 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-utilities" (OuterVolumeSpecName: "utilities") pod "a21bb194-d03c-4eba-bf1f-ac216685a432" (UID: "a21bb194-d03c-4eba-bf1f-ac216685a432"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.293155 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.303506 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21bb194-d03c-4eba-bf1f-ac216685a432-kube-api-access-pnzmw" (OuterVolumeSpecName: "kube-api-access-pnzmw") pod "a21bb194-d03c-4eba-bf1f-ac216685a432" (UID: "a21bb194-d03c-4eba-bf1f-ac216685a432"). InnerVolumeSpecName "kube-api-access-pnzmw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.395250 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnzmw\" (UniqueName: \"kubernetes.io/projected/a21bb194-d03c-4eba-bf1f-ac216685a432-kube-api-access-pnzmw\") on node \"crc\" DevicePath \"\""
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.440981 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a21bb194-d03c-4eba-bf1f-ac216685a432" (UID: "a21bb194-d03c-4eba-bf1f-ac216685a432"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.497303 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a21bb194-d03c-4eba-bf1f-ac216685a432-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.656231 4892 generic.go:334] "Generic (PLEG): container finished" podID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerID="46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1" exitCode=0
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.656275 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbhwk" event={"ID":"a21bb194-d03c-4eba-bf1f-ac216685a432","Type":"ContainerDied","Data":"46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1"}
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.656302 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bbhwk" event={"ID":"a21bb194-d03c-4eba-bf1f-ac216685a432","Type":"ContainerDied","Data":"2331830ff2ed8f008ddc52fa6e47d91fceacd7e21016f5e6e72ad93181d074e6"}
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.656332 4892 scope.go:117] "RemoveContainer" containerID="46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.656490 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bbhwk"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.697001 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bbhwk"]
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.706304 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bbhwk"]
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.711092 4892 scope.go:117] "RemoveContainer" containerID="7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.736172 4892 scope.go:117] "RemoveContainer" containerID="c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.784952 4892 scope.go:117] "RemoveContainer" containerID="46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1"
Oct 06 12:42:30 crc kubenswrapper[4892]: E1006 12:42:30.785344 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1\": container with ID starting with 46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1 not found: ID does not exist" containerID="46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.785375 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1"} err="failed to get container status \"46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1\": rpc error: code = NotFound desc = could not find container \"46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1\": container with ID starting with 46e4f819f2f69093c14e07e07e3cc5249ca7a292eda2b4ff3eb69d8700e400c1 not found: ID does not exist"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.785396 4892 scope.go:117] "RemoveContainer" containerID="7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b"
Oct 06 12:42:30 crc kubenswrapper[4892]: E1006 12:42:30.785798 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b\": container with ID starting with 7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b not found: ID does not exist" containerID="7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.785862 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b"} err="failed to get container status \"7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b\": rpc error: code = NotFound desc = could not find container \"7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b\": container with ID starting with 7e78836a3e994140b8257b5ca6d016ec472a283eeec847710c2d70340677b85b not found: ID does not exist"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.785904 4892 scope.go:117] "RemoveContainer" containerID="c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef"
Oct 06 12:42:30 crc kubenswrapper[4892]: E1006 12:42:30.786446 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef\": container with ID starting with c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef not found: ID does not exist" containerID="c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef"
Oct 06 12:42:30 crc kubenswrapper[4892]: I1006 12:42:30.786483 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef"} err="failed to get container status \"c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef\": rpc error: code = NotFound desc = could not find container \"c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef\": container with ID starting with c6949f3bc3630b0667b4453da4c46c622f3688768fd19907a03deb154c9dc2ef not found: ID does not exist"
Oct 06 12:42:32 crc kubenswrapper[4892]: I1006 12:42:32.195998 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" path="/var/lib/kubelet/pods/a21bb194-d03c-4eba-bf1f-ac216685a432/volumes"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.834943 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tstgr"]
Oct 06 12:42:57 crc kubenswrapper[4892]: E1006 12:42:57.836225 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="registry-server"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.836248 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="registry-server"
Oct 06 12:42:57 crc kubenswrapper[4892]: E1006 12:42:57.836310 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="extract-utilities"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.837930 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="extract-utilities"
Oct 06 12:42:57 crc kubenswrapper[4892]: E1006 12:42:57.837999 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="extract-content"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.838033 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="extract-content"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.838435 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21bb194-d03c-4eba-bf1f-ac216685a432" containerName="registry-server"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.841044 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.857647 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tstgr"]
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.929765 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljkdc\" (UniqueName: \"kubernetes.io/projected/b3f756a4-2078-49ca-b6b8-716ef5263b06-kube-api-access-ljkdc\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.930142 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-catalog-content\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:57 crc kubenswrapper[4892]: I1006 12:42:57.930345 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-utilities\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.032597 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-utilities\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.032742 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljkdc\" (UniqueName: \"kubernetes.io/projected/b3f756a4-2078-49ca-b6b8-716ef5263b06-kube-api-access-ljkdc\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.032846 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-catalog-content\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.033170 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-utilities\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.033472 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-catalog-content\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.070423 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljkdc\" (UniqueName: \"kubernetes.io/projected/b3f756a4-2078-49ca-b6b8-716ef5263b06-kube-api-access-ljkdc\") pod \"redhat-marketplace-tstgr\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") " pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.186988 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:42:58 crc kubenswrapper[4892]: I1006 12:42:58.657125 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tstgr"]
Oct 06 12:42:59 crc kubenswrapper[4892]: I1006 12:42:59.004275 4892 generic.go:334] "Generic (PLEG): container finished" podID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerID="8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11" exitCode=0
Oct 06 12:42:59 crc kubenswrapper[4892]: I1006 12:42:59.004345 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tstgr" event={"ID":"b3f756a4-2078-49ca-b6b8-716ef5263b06","Type":"ContainerDied","Data":"8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11"}
Oct 06 12:42:59 crc kubenswrapper[4892]: I1006 12:42:59.004376 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tstgr" event={"ID":"b3f756a4-2078-49ca-b6b8-716ef5263b06","Type":"ContainerStarted","Data":"641bdd2913e38d4642709c6b51f802337a170e16683a3bfeedaed44888bcdfc7"}
Oct 06 12:43:01 crc kubenswrapper[4892]: I1006 12:43:01.029501 4892 generic.go:334] "Generic (PLEG): container finished" podID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerID="80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9" exitCode=0
Oct 06 12:43:01 crc kubenswrapper[4892]: I1006 12:43:01.029637 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tstgr" event={"ID":"b3f756a4-2078-49ca-b6b8-716ef5263b06","Type":"ContainerDied","Data":"80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9"}
Oct 06 12:43:02 crc kubenswrapper[4892]: I1006 12:43:02.045391 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tstgr" event={"ID":"b3f756a4-2078-49ca-b6b8-716ef5263b06","Type":"ContainerStarted","Data":"8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63"}
Oct 06 12:43:02 crc kubenswrapper[4892]: I1006 12:43:02.069155 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tstgr" podStartSLOduration=2.577207339 podStartE2EDuration="5.069125833s" podCreationTimestamp="2025-10-06 12:42:57 +0000 UTC" firstStartedPulling="2025-10-06 12:42:59.008094065 +0000 UTC m=+2065.557799880" lastFinishedPulling="2025-10-06 12:43:01.500012609 +0000 UTC m=+2068.049718374" observedRunningTime="2025-10-06 12:43:02.062659787 +0000 UTC m=+2068.612365552" watchObservedRunningTime="2025-10-06 12:43:02.069125833 +0000 UTC m=+2068.618831608"
Oct 06 12:43:08 crc kubenswrapper[4892]: I1006 12:43:08.187119 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:43:08 crc kubenswrapper[4892]: I1006 12:43:08.187746 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:43:08 crc kubenswrapper[4892]: I1006 12:43:08.247023 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:43:09 crc kubenswrapper[4892]: I1006 12:43:09.200254 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:43:09 crc kubenswrapper[4892]: I1006 12:43:09.255178 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tstgr"]
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.143351 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tstgr" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="registry-server" containerID="cri-o://8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63" gracePeriod=2
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.594636 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tstgr"
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.637840 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljkdc\" (UniqueName: \"kubernetes.io/projected/b3f756a4-2078-49ca-b6b8-716ef5263b06-kube-api-access-ljkdc\") pod \"b3f756a4-2078-49ca-b6b8-716ef5263b06\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") "
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.637993 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-catalog-content\") pod \"b3f756a4-2078-49ca-b6b8-716ef5263b06\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") "
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.638066 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-utilities\") pod \"b3f756a4-2078-49ca-b6b8-716ef5263b06\" (UID: \"b3f756a4-2078-49ca-b6b8-716ef5263b06\") "
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.639309 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-utilities" (OuterVolumeSpecName: "utilities") pod "b3f756a4-2078-49ca-b6b8-716ef5263b06" (UID: "b3f756a4-2078-49ca-b6b8-716ef5263b06"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.644207 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3f756a4-2078-49ca-b6b8-716ef5263b06-kube-api-access-ljkdc" (OuterVolumeSpecName: "kube-api-access-ljkdc") pod "b3f756a4-2078-49ca-b6b8-716ef5263b06" (UID: "b3f756a4-2078-49ca-b6b8-716ef5263b06"). InnerVolumeSpecName "kube-api-access-ljkdc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.652754 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3f756a4-2078-49ca-b6b8-716ef5263b06" (UID: "b3f756a4-2078-49ca-b6b8-716ef5263b06"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.740334 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.740368 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljkdc\" (UniqueName: \"kubernetes.io/projected/b3f756a4-2078-49ca-b6b8-716ef5263b06-kube-api-access-ljkdc\") on node \"crc\" DevicePath \"\"" Oct 06 12:43:11 crc kubenswrapper[4892]: I1006 12:43:11.740383 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3f756a4-2078-49ca-b6b8-716ef5263b06-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.164417 4892 generic.go:334] "Generic (PLEG): container finished" podID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerID="8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63" exitCode=0 Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.164486 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tstgr" event={"ID":"b3f756a4-2078-49ca-b6b8-716ef5263b06","Type":"ContainerDied","Data":"8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63"} Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.164801 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tstgr" event={"ID":"b3f756a4-2078-49ca-b6b8-716ef5263b06","Type":"ContainerDied","Data":"641bdd2913e38d4642709c6b51f802337a170e16683a3bfeedaed44888bcdfc7"} Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.164830 4892 scope.go:117] "RemoveContainer" containerID="8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.164508 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tstgr" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.212060 4892 scope.go:117] "RemoveContainer" containerID="80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.231217 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tstgr"] Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.245988 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tstgr"] Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.249453 4892 scope.go:117] "RemoveContainer" containerID="8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.293448 4892 scope.go:117] "RemoveContainer" containerID="8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63" Oct 06 12:43:12 crc kubenswrapper[4892]: E1006 12:43:12.293894 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63\": container with ID starting with 8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63 not found: ID does not exist" containerID="8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.293975 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63"} err="failed to get container status \"8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63\": rpc error: code = NotFound desc = could not find container \"8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63\": container with ID starting with 8135908a20c6dc4f91e84cb58a8445fe8a626da03feb3dcfbd6d27645a2d9b63 not found: ID does not exist" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.294018 4892 scope.go:117] "RemoveContainer" containerID="80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9" Oct 06 12:43:12 crc kubenswrapper[4892]: E1006 12:43:12.294388 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9\": container with ID starting with 80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9 not found: ID does not exist" containerID="80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.294426 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9"} err="failed to get container status \"80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9\": rpc error: code = NotFound desc = could not find container \"80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9\": container with ID starting with 80b572e1aaa9ffb519cca42e1a6c304152dfd824c2d6cfa5ab369236a61579c9 not found: ID does not exist" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.294453 4892 scope.go:117] "RemoveContainer" containerID="8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11" Oct 06 12:43:12 crc kubenswrapper[4892]: E1006 12:43:12.294774 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11\": container with ID starting with 8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11 not found: ID does not exist" containerID="8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11" Oct 06 12:43:12 crc kubenswrapper[4892]: I1006 12:43:12.294818 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11"} err="failed to get container status \"8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11\": rpc error: code = NotFound desc = could not find container \"8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11\": container with ID starting with 8afeaf9a36d69a476a418115e02bdf0e8ff93c9eb08a5baccde049ebdbce9b11 not found: ID does not exist" Oct 06 12:43:14 crc kubenswrapper[4892]: I1006 12:43:14.184967 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" path="/var/lib/kubelet/pods/b3f756a4-2078-49ca-b6b8-716ef5263b06/volumes" Oct 06 12:43:22 crc kubenswrapper[4892]: I1006 12:43:22.298908 4892 generic.go:334] "Generic (PLEG): container finished" podID="073e303e-602b-4cce-b0c5-f9da295a63a4" containerID="99bb33f9dccadf80261aaeb3f6f29dd6e8bd715b70a56e184b8510b26d28563b" exitCode=0 Oct 06 12:43:22 crc kubenswrapper[4892]: I1006 12:43:22.298984 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq" event={"ID":"073e303e-602b-4cce-b0c5-f9da295a63a4","Type":"ContainerDied","Data":"99bb33f9dccadf80261aaeb3f6f29dd6e8bd715b70a56e184b8510b26d28563b"} Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.831364 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.906776 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt9b2\" (UniqueName: \"kubernetes.io/projected/073e303e-602b-4cce-b0c5-f9da295a63a4-kube-api-access-rt9b2\") pod \"073e303e-602b-4cce-b0c5-f9da295a63a4\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") "
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.906839 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ssh-key\") pod \"073e303e-602b-4cce-b0c5-f9da295a63a4\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") "
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.907077 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/073e303e-602b-4cce-b0c5-f9da295a63a4-ovncontroller-config-0\") pod \"073e303e-602b-4cce-b0c5-f9da295a63a4\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") "
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.907215 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ovn-combined-ca-bundle\") pod \"073e303e-602b-4cce-b0c5-f9da295a63a4\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") "
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.907622 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-inventory\") pod \"073e303e-602b-4cce-b0c5-f9da295a63a4\" (UID: \"073e303e-602b-4cce-b0c5-f9da295a63a4\") "
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.919701 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "073e303e-602b-4cce-b0c5-f9da295a63a4" (UID: "073e303e-602b-4cce-b0c5-f9da295a63a4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.920039 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073e303e-602b-4cce-b0c5-f9da295a63a4-kube-api-access-rt9b2" (OuterVolumeSpecName: "kube-api-access-rt9b2") pod "073e303e-602b-4cce-b0c5-f9da295a63a4" (UID: "073e303e-602b-4cce-b0c5-f9da295a63a4"). InnerVolumeSpecName "kube-api-access-rt9b2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.938634 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073e303e-602b-4cce-b0c5-f9da295a63a4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "073e303e-602b-4cce-b0c5-f9da295a63a4" (UID: "073e303e-602b-4cce-b0c5-f9da295a63a4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.962763 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-inventory" (OuterVolumeSpecName: "inventory") pod "073e303e-602b-4cce-b0c5-f9da295a63a4" (UID: "073e303e-602b-4cce-b0c5-f9da295a63a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:43:23 crc kubenswrapper[4892]: I1006 12:43:23.976525 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "073e303e-602b-4cce-b0c5-f9da295a63a4" (UID: "073e303e-602b-4cce-b0c5-f9da295a63a4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.010173 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-inventory\") on node \"crc\" DevicePath \"\""
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.010453 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt9b2\" (UniqueName: \"kubernetes.io/projected/073e303e-602b-4cce-b0c5-f9da295a63a4-kube-api-access-rt9b2\") on node \"crc\" DevicePath \"\""
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.010595 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.010700 4892 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/073e303e-602b-4cce-b0c5-f9da295a63a4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.010843 4892 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/073e303e-602b-4cce-b0c5-f9da295a63a4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.344440 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq" event={"ID":"073e303e-602b-4cce-b0c5-f9da295a63a4","Type":"ContainerDied","Data":"e15cae6a5403dbb71f6d56abe5e69e0fae624c34fca132b7313fe89982f27144"}
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.344499 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e15cae6a5403dbb71f6d56abe5e69e0fae624c34fca132b7313fe89982f27144"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.344615 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-zwdjq"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.452854 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"]
Oct 06 12:43:24 crc kubenswrapper[4892]: E1006 12:43:24.453392 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073e303e-602b-4cce-b0c5-f9da295a63a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.453412 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="073e303e-602b-4cce-b0c5-f9da295a63a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 12:43:24 crc kubenswrapper[4892]: E1006 12:43:24.453426 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="extract-utilities"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.453435 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="extract-utilities"
Oct 06 12:43:24 crc kubenswrapper[4892]: E1006 12:43:24.453474 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="registry-server"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.453482 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="registry-server"
Oct 06 12:43:24 crc kubenswrapper[4892]: E1006 12:43:24.453498 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="extract-content"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.453506 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="extract-content"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.453735 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="073e303e-602b-4cce-b0c5-f9da295a63a4" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.453759 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3f756a4-2078-49ca-b6b8-716ef5263b06" containerName="registry-server"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.454584 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.458202 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.458441 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.458687 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.458743 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.459067 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.459118 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.463370 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"]
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.540433 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxfrb\" (UniqueName: \"kubernetes.io/projected/4341f2a6-b1f7-453a-84db-a4ba1888c381-kube-api-access-vxfrb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.540519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.540598 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.540672 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.540862 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.541193 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.642854 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.642990 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.643076 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.643170 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.643268 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.643446 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxfrb\" (UniqueName: \"kubernetes.io/projected/4341f2a6-b1f7-453a-84db-a4ba1888c381-kube-api-access-vxfrb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.648425 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.648698 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.648860 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.649601 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.651771 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.673534 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxfrb\" (UniqueName: \"kubernetes.io/projected/4341f2a6-b1f7-453a-84db-a4ba1888c381-kube-api-access-vxfrb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"
Oct 06 12:43:24 crc kubenswrapper[4892]: I1006 12:43:24.790029 4892 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" Oct 06 12:43:25 crc kubenswrapper[4892]: I1006 12:43:25.349190 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk"] Oct 06 12:43:25 crc kubenswrapper[4892]: W1006 12:43:25.358088 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4341f2a6_b1f7_453a_84db_a4ba1888c381.slice/crio-6b20a3ba232e9feac1cba92095430c287885bd8f3ae32b7b100b6bb48c8c2fac WatchSource:0}: Error finding container 6b20a3ba232e9feac1cba92095430c287885bd8f3ae32b7b100b6bb48c8c2fac: Status 404 returned error can't find the container with id 6b20a3ba232e9feac1cba92095430c287885bd8f3ae32b7b100b6bb48c8c2fac Oct 06 12:43:26 crc kubenswrapper[4892]: I1006 12:43:26.369814 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" event={"ID":"4341f2a6-b1f7-453a-84db-a4ba1888c381","Type":"ContainerStarted","Data":"de9fa30484687678c65dfa42ded53421fc025bd42c5421df232b9d774e83aa39"} Oct 06 12:43:26 crc kubenswrapper[4892]: I1006 12:43:26.370612 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" event={"ID":"4341f2a6-b1f7-453a-84db-a4ba1888c381","Type":"ContainerStarted","Data":"6b20a3ba232e9feac1cba92095430c287885bd8f3ae32b7b100b6bb48c8c2fac"} Oct 06 12:43:26 crc kubenswrapper[4892]: I1006 12:43:26.405666 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" podStartSLOduration=1.967205756 podStartE2EDuration="2.405640325s" podCreationTimestamp="2025-10-06 12:43:24 +0000 UTC" firstStartedPulling="2025-10-06 12:43:25.361304959 +0000 UTC m=+2091.911010734" lastFinishedPulling="2025-10-06 12:43:25.799739508 +0000 UTC m=+2092.349445303" observedRunningTime="2025-10-06 12:43:26.389073046 +0000 UTC m=+2092.938778821" watchObservedRunningTime="2025-10-06 12:43:26.405640325 +0000 UTC m=+2092.955346130" Oct 06 12:44:22 crc kubenswrapper[4892]: I1006 12:44:22.985574 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:44:22 crc kubenswrapper[4892]: I1006 12:44:22.986252 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:44:23 crc kubenswrapper[4892]: I1006 12:44:23.011542 4892 generic.go:334] "Generic (PLEG): container finished" podID="4341f2a6-b1f7-453a-84db-a4ba1888c381" containerID="de9fa30484687678c65dfa42ded53421fc025bd42c5421df232b9d774e83aa39" exitCode=0 Oct 06 12:44:23 crc kubenswrapper[4892]: I1006 12:44:23.011588 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" event={"ID":"4341f2a6-b1f7-453a-84db-a4ba1888c381","Type":"ContainerDied","Data":"de9fa30484687678c65dfa42ded53421fc025bd42c5421df232b9d774e83aa39"} Oct 06 12:44:24 crc 
kubenswrapper[4892]: I1006 12:44:24.483533 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.615216 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxfrb\" (UniqueName: \"kubernetes.io/projected/4341f2a6-b1f7-453a-84db-a4ba1888c381-kube-api-access-vxfrb\") pod \"4341f2a6-b1f7-453a-84db-a4ba1888c381\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.615284 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4341f2a6-b1f7-453a-84db-a4ba1888c381\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.615369 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-ssh-key\") pod \"4341f2a6-b1f7-453a-84db-a4ba1888c381\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.615420 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-nova-metadata-neutron-config-0\") pod \"4341f2a6-b1f7-453a-84db-a4ba1888c381\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.615583 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-metadata-combined-ca-bundle\") pod \"4341f2a6-b1f7-453a-84db-a4ba1888c381\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.615681 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-inventory\") pod \"4341f2a6-b1f7-453a-84db-a4ba1888c381\" (UID: \"4341f2a6-b1f7-453a-84db-a4ba1888c381\") " Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.630022 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4341f2a6-b1f7-453a-84db-a4ba1888c381-kube-api-access-vxfrb" (OuterVolumeSpecName: "kube-api-access-vxfrb") pod "4341f2a6-b1f7-453a-84db-a4ba1888c381" (UID: "4341f2a6-b1f7-453a-84db-a4ba1888c381"). InnerVolumeSpecName "kube-api-access-vxfrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.630173 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4341f2a6-b1f7-453a-84db-a4ba1888c381" (UID: "4341f2a6-b1f7-453a-84db-a4ba1888c381"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.643338 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4341f2a6-b1f7-453a-84db-a4ba1888c381" (UID: "4341f2a6-b1f7-453a-84db-a4ba1888c381"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.645889 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4341f2a6-b1f7-453a-84db-a4ba1888c381" (UID: "4341f2a6-b1f7-453a-84db-a4ba1888c381"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.649650 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-inventory" (OuterVolumeSpecName: "inventory") pod "4341f2a6-b1f7-453a-84db-a4ba1888c381" (UID: "4341f2a6-b1f7-453a-84db-a4ba1888c381"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.655514 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4341f2a6-b1f7-453a-84db-a4ba1888c381" (UID: "4341f2a6-b1f7-453a-84db-a4ba1888c381"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.717896 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.717930 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.717945 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxfrb\" (UniqueName: \"kubernetes.io/projected/4341f2a6-b1f7-453a-84db-a4ba1888c381-kube-api-access-vxfrb\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.717960 4892 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.717971 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:24 crc kubenswrapper[4892]: I1006 12:44:24.717984 4892 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4341f2a6-b1f7-453a-84db-a4ba1888c381-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.036249 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" event={"ID":"4341f2a6-b1f7-453a-84db-a4ba1888c381","Type":"ContainerDied","Data":"6b20a3ba232e9feac1cba92095430c287885bd8f3ae32b7b100b6bb48c8c2fac"} Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.036293 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b20a3ba232e9feac1cba92095430c287885bd8f3ae32b7b100b6bb48c8c2fac" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.036325 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.141448 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq"] Oct 06 12:44:25 crc kubenswrapper[4892]: E1006 12:44:25.141997 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4341f2a6-b1f7-453a-84db-a4ba1888c381" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.142022 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4341f2a6-b1f7-453a-84db-a4ba1888c381" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.142302 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4341f2a6-b1f7-453a-84db-a4ba1888c381" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.143131 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.145718 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.148922 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.149183 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.149380 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.149876 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.154294 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq"] Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.227663 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62l5\" (UniqueName: \"kubernetes.io/projected/aa2cf17a-4ca9-414f-9421-46fa314679d0-kube-api-access-q62l5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.227723 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.228047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.228829 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.229550 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.331592 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.331691 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.331772 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.331857 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q62l5\" (UniqueName: \"kubernetes.io/projected/aa2cf17a-4ca9-414f-9421-46fa314679d0-kube-api-access-q62l5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.331888 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.336069 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.336255 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.336274 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.336443 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-secret-0\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.349064 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62l5\" (UniqueName: \"kubernetes.io/projected/aa2cf17a-4ca9-414f-9421-46fa314679d0-kube-api-access-q62l5\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6njgq\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:25 crc kubenswrapper[4892]: I1006 12:44:25.463158 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.026952 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq"] Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.050573 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" event={"ID":"aa2cf17a-4ca9-414f-9421-46fa314679d0","Type":"ContainerStarted","Data":"df01bf2569fca879c0cef8dde6cb292cdf75e28f986afa6e5126f306aa110046"} Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.246633 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t5f25"] Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.250290 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.261672 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t5f25"] Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.354281 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87nnw\" (UniqueName: \"kubernetes.io/projected/31a94b4c-8491-47eb-8c83-b1ba62d5da66-kube-api-access-87nnw\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.354941 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-catalog-content\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.355405 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-utilities\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.457912 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-catalog-content\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: 
I1006 12:44:26.458058 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-utilities\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.458101 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87nnw\" (UniqueName: \"kubernetes.io/projected/31a94b4c-8491-47eb-8c83-b1ba62d5da66-kube-api-access-87nnw\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.458630 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-catalog-content\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.458819 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-utilities\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.490046 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87nnw\" (UniqueName: \"kubernetes.io/projected/31a94b4c-8491-47eb-8c83-b1ba62d5da66-kube-api-access-87nnw\") pod \"certified-operators-t5f25\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:26 crc kubenswrapper[4892]: I1006 12:44:26.584655 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:27 crc kubenswrapper[4892]: I1006 12:44:27.061522 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" event={"ID":"aa2cf17a-4ca9-414f-9421-46fa314679d0","Type":"ContainerStarted","Data":"37e8ea72bbfffb2258b60084302aedfc8f033691f4e4f56002df08f84cff337c"} Oct 06 12:44:27 crc kubenswrapper[4892]: I1006 12:44:27.082768 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" podStartSLOduration=1.583450176 podStartE2EDuration="2.082749707s" podCreationTimestamp="2025-10-06 12:44:25 +0000 UTC" firstStartedPulling="2025-10-06 12:44:26.032520838 +0000 UTC m=+2152.582226603" lastFinishedPulling="2025-10-06 12:44:26.531820369 +0000 UTC m=+2153.081526134" observedRunningTime="2025-10-06 12:44:27.07763222 +0000 UTC m=+2153.627337995" watchObservedRunningTime="2025-10-06 12:44:27.082749707 +0000 UTC m=+2153.632455472" Oct 06 12:44:27 crc kubenswrapper[4892]: I1006 12:44:27.113932 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t5f25"] Oct 06 12:44:28 crc kubenswrapper[4892]: I1006 12:44:28.070617 4892 generic.go:334] "Generic (PLEG): container finished" podID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerID="4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552" exitCode=0 Oct 06 12:44:28 crc kubenswrapper[4892]: I1006 12:44:28.070773 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t5f25" event={"ID":"31a94b4c-8491-47eb-8c83-b1ba62d5da66","Type":"ContainerDied","Data":"4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552"} Oct 06 12:44:28 crc kubenswrapper[4892]: I1006 12:44:28.071287 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t5f25" event={"ID":"31a94b4c-8491-47eb-8c83-b1ba62d5da66","Type":"ContainerStarted","Data":"c4a01ec785f3e89ec55b878488b1d74d8aa3b9968443f5ee135483b472fc2fb8"} Oct 06 12:44:29 crc kubenswrapper[4892]: I1006 12:44:29.087669 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t5f25" event={"ID":"31a94b4c-8491-47eb-8c83-b1ba62d5da66","Type":"ContainerStarted","Data":"9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901"} Oct 06 12:44:30 crc kubenswrapper[4892]: I1006 12:44:30.100039 4892 generic.go:334] "Generic (PLEG): container finished" podID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerID="9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901" exitCode=0 Oct 06 12:44:30 crc kubenswrapper[4892]: I1006 12:44:30.100087 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t5f25" event={"ID":"31a94b4c-8491-47eb-8c83-b1ba62d5da66","Type":"ContainerDied","Data":"9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901"} Oct 06 12:44:31 crc kubenswrapper[4892]: I1006 12:44:31.111035 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t5f25" event={"ID":"31a94b4c-8491-47eb-8c83-b1ba62d5da66","Type":"ContainerStarted","Data":"43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338"} Oct 06 12:44:31 crc kubenswrapper[4892]: I1006 12:44:31.140969 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t5f25" 
podStartSLOduration=2.670285884 podStartE2EDuration="5.140945062s" podCreationTimestamp="2025-10-06 12:44:26 +0000 UTC" firstStartedPulling="2025-10-06 12:44:28.072865404 +0000 UTC m=+2154.622571189" lastFinishedPulling="2025-10-06 12:44:30.543524582 +0000 UTC m=+2157.093230367" observedRunningTime="2025-10-06 12:44:31.130697347 +0000 UTC m=+2157.680403102" watchObservedRunningTime="2025-10-06 12:44:31.140945062 +0000 UTC m=+2157.690650837" Oct 06 12:44:36 crc kubenswrapper[4892]: I1006 12:44:36.585523 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:36 crc kubenswrapper[4892]: I1006 12:44:36.586712 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:36 crc kubenswrapper[4892]: I1006 12:44:36.647147 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:37 crc kubenswrapper[4892]: I1006 12:44:37.229831 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:37 crc kubenswrapper[4892]: I1006 12:44:37.293811 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t5f25"] Oct 06 12:44:39 crc kubenswrapper[4892]: I1006 12:44:39.199225 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t5f25" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="registry-server" containerID="cri-o://43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338" gracePeriod=2 Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.155024 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.214799 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-utilities\") pod \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.215400 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-catalog-content\") pod \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.215520 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87nnw\" (UniqueName: \"kubernetes.io/projected/31a94b4c-8491-47eb-8c83-b1ba62d5da66-kube-api-access-87nnw\") pod \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\" (UID: \"31a94b4c-8491-47eb-8c83-b1ba62d5da66\") " Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.217383 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-utilities" (OuterVolumeSpecName: "utilities") pod "31a94b4c-8491-47eb-8c83-b1ba62d5da66" (UID: "31a94b4c-8491-47eb-8c83-b1ba62d5da66"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.265559 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a94b4c-8491-47eb-8c83-b1ba62d5da66-kube-api-access-87nnw" (OuterVolumeSpecName: "kube-api-access-87nnw") pod "31a94b4c-8491-47eb-8c83-b1ba62d5da66" (UID: "31a94b4c-8491-47eb-8c83-b1ba62d5da66"). InnerVolumeSpecName "kube-api-access-87nnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.317888 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87nnw\" (UniqueName: \"kubernetes.io/projected/31a94b4c-8491-47eb-8c83-b1ba62d5da66-kube-api-access-87nnw\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.317917 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.320794 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31a94b4c-8491-47eb-8c83-b1ba62d5da66" (UID: "31a94b4c-8491-47eb-8c83-b1ba62d5da66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.321302 4892 generic.go:334] "Generic (PLEG): container finished" podID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerID="43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338" exitCode=0 Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.321376 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t5f25" event={"ID":"31a94b4c-8491-47eb-8c83-b1ba62d5da66","Type":"ContainerDied","Data":"43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338"} Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.321410 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t5f25" event={"ID":"31a94b4c-8491-47eb-8c83-b1ba62d5da66","Type":"ContainerDied","Data":"c4a01ec785f3e89ec55b878488b1d74d8aa3b9968443f5ee135483b472fc2fb8"} Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.321430 4892 scope.go:117] "RemoveContainer" containerID="43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.321633 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t5f25" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.364342 4892 scope.go:117] "RemoveContainer" containerID="9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.391300 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t5f25"] Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.397392 4892 scope.go:117] "RemoveContainer" containerID="4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.403593 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t5f25"] Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.419372 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a94b4c-8491-47eb-8c83-b1ba62d5da66-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.428349 4892 scope.go:117] "RemoveContainer" containerID="43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338" Oct 06 12:44:40 crc kubenswrapper[4892]: E1006 12:44:40.428837 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338\": container with ID starting with 43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338 not found: ID does not exist" containerID="43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.428879 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338"} err="failed to get container status \"43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338\": rpc error: code = NotFound desc = could not find container \"43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338\": container with ID starting with 43fb9577784da04ea899b197c6bc19412c696518036298295034d3e0772f8338 not found: ID does not exist" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.428909 4892 scope.go:117] "RemoveContainer" containerID="9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901" Oct 06 12:44:40 crc kubenswrapper[4892]: E1006 12:44:40.429267 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901\": container with ID starting with 9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901 not found: ID does not exist" containerID="9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.429313 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901"} err="failed to get container status \"9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901\": rpc error: code = NotFound desc = could not find container \"9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901\": container with ID starting with 9b38ca90f7dec74d13bbe825419b1b824630c4ba2aacf89063203bcfb6e04901 not found: ID does not exist" Oct 06 12:44:40 crc 
kubenswrapper[4892]: I1006 12:44:40.429360 4892 scope.go:117] "RemoveContainer" containerID="4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552" Oct 06 12:44:40 crc kubenswrapper[4892]: E1006 12:44:40.429685 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552\": container with ID starting with 4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552 not found: ID does not exist" containerID="4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552" Oct 06 12:44:40 crc kubenswrapper[4892]: I1006 12:44:40.429717 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552"} err="failed to get container status \"4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552\": rpc error: code = NotFound desc = could not find container \"4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552\": container with ID starting with 4c90dd546eb503eb2b945f95aa1c1f07c840a7d98eacf6f6f0fd6606d2588552 not found: ID does not exist" Oct 06 12:44:42 crc kubenswrapper[4892]: I1006 12:44:42.186932 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" path="/var/lib/kubelet/pods/31a94b4c-8491-47eb-8c83-b1ba62d5da66/volumes" Oct 06 12:44:52 crc kubenswrapper[4892]: I1006 12:44:52.986503 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:44:52 crc kubenswrapper[4892]: I1006 12:44:52.987093 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.153651 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm"] Oct 06 12:45:00 crc kubenswrapper[4892]: E1006 12:45:00.154630 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="registry-server" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.154650 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="registry-server" Oct 06 12:45:00 crc kubenswrapper[4892]: E1006 12:45:00.154668 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="extract-utilities" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.154676 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="extract-utilities" Oct 06 12:45:00 crc kubenswrapper[4892]: E1006 12:45:00.154705 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="extract-content" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.154714 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="extract-content" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.154955 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a94b4c-8491-47eb-8c83-b1ba62d5da66" containerName="registry-server" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.155820 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.158196 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.161159 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.192212 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm"] Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.233574 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lg9\" (UniqueName: \"kubernetes.io/projected/7e6d00e2-2088-4bd1-8269-2337f0421848-kube-api-access-x4lg9\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.233680 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e6d00e2-2088-4bd1-8269-2337f0421848-secret-volume\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.233714 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e6d00e2-2088-4bd1-8269-2337f0421848-config-volume\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.335587 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lg9\" (UniqueName: \"kubernetes.io/projected/7e6d00e2-2088-4bd1-8269-2337f0421848-kube-api-access-x4lg9\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.335913 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e6d00e2-2088-4bd1-8269-2337f0421848-secret-volume\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.335944 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e6d00e2-2088-4bd1-8269-2337f0421848-config-volume\") pod 
\"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.337418 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e6d00e2-2088-4bd1-8269-2337f0421848-config-volume\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.346704 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e6d00e2-2088-4bd1-8269-2337f0421848-secret-volume\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.355860 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lg9\" (UniqueName: \"kubernetes.io/projected/7e6d00e2-2088-4bd1-8269-2337f0421848-kube-api-access-x4lg9\") pod \"collect-profiles-29329245-b58zm\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.487479 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:00 crc kubenswrapper[4892]: I1006 12:45:00.992783 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm"] Oct 06 12:45:01 crc kubenswrapper[4892]: I1006 12:45:01.561919 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" event={"ID":"7e6d00e2-2088-4bd1-8269-2337f0421848","Type":"ContainerStarted","Data":"40004991028aa620cdd7a290a3f55479f2ee85945621d94e3471ed91dfa56c30"} Oct 06 12:45:01 crc kubenswrapper[4892]: I1006 12:45:01.563229 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" event={"ID":"7e6d00e2-2088-4bd1-8269-2337f0421848","Type":"ContainerStarted","Data":"84d8197642176f3d1dd0d849edb778bb67cfb4ca6c026d0fb936b1979c21c425"} Oct 06 12:45:01 crc kubenswrapper[4892]: I1006 12:45:01.586527 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" podStartSLOduration=1.586502791 podStartE2EDuration="1.586502791s" podCreationTimestamp="2025-10-06 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:45:01.577483741 +0000 UTC m=+2188.127189506" watchObservedRunningTime="2025-10-06 12:45:01.586502791 +0000 UTC m=+2188.136208576" Oct 06 12:45:02 crc kubenswrapper[4892]: I1006 12:45:02.574399 4892 generic.go:334] "Generic (PLEG): container finished" podID="7e6d00e2-2088-4bd1-8269-2337f0421848" containerID="40004991028aa620cdd7a290a3f55479f2ee85945621d94e3471ed91dfa56c30" exitCode=0 Oct 06 12:45:02 crc kubenswrapper[4892]: I1006 12:45:02.574551 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" event={"ID":"7e6d00e2-2088-4bd1-8269-2337f0421848","Type":"ContainerDied","Data":"40004991028aa620cdd7a290a3f55479f2ee85945621d94e3471ed91dfa56c30"} Oct 06 12:45:03 crc kubenswrapper[4892]: I1006 12:45:03.940746 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.017924 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e6d00e2-2088-4bd1-8269-2337f0421848-secret-volume\") pod \"7e6d00e2-2088-4bd1-8269-2337f0421848\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.018006 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e6d00e2-2088-4bd1-8269-2337f0421848-config-volume\") pod \"7e6d00e2-2088-4bd1-8269-2337f0421848\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.018090 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4lg9\" (UniqueName: \"kubernetes.io/projected/7e6d00e2-2088-4bd1-8269-2337f0421848-kube-api-access-x4lg9\") pod \"7e6d00e2-2088-4bd1-8269-2337f0421848\" (UID: \"7e6d00e2-2088-4bd1-8269-2337f0421848\") " Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.019140 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6d00e2-2088-4bd1-8269-2337f0421848-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e6d00e2-2088-4bd1-8269-2337f0421848" (UID: "7e6d00e2-2088-4bd1-8269-2337f0421848"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.027516 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6d00e2-2088-4bd1-8269-2337f0421848-kube-api-access-x4lg9" (OuterVolumeSpecName: "kube-api-access-x4lg9") pod "7e6d00e2-2088-4bd1-8269-2337f0421848" (UID: "7e6d00e2-2088-4bd1-8269-2337f0421848"). InnerVolumeSpecName "kube-api-access-x4lg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.031603 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6d00e2-2088-4bd1-8269-2337f0421848-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e6d00e2-2088-4bd1-8269-2337f0421848" (UID: "7e6d00e2-2088-4bd1-8269-2337f0421848"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.120456 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4lg9\" (UniqueName: \"kubernetes.io/projected/7e6d00e2-2088-4bd1-8269-2337f0421848-kube-api-access-x4lg9\") on node \"crc\" DevicePath \"\"" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.120501 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e6d00e2-2088-4bd1-8269-2337f0421848-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.120516 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e6d00e2-2088-4bd1-8269-2337f0421848-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.600094 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" event={"ID":"7e6d00e2-2088-4bd1-8269-2337f0421848","Type":"ContainerDied","Data":"84d8197642176f3d1dd0d849edb778bb67cfb4ca6c026d0fb936b1979c21c425"} Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.600153 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84d8197642176f3d1dd0d849edb778bb67cfb4ca6c026d0fb936b1979c21c425" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.600223 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm" Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.676100 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"] Oct 06 12:45:04 crc kubenswrapper[4892]: I1006 12:45:04.688171 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329200-mltkw"] Oct 06 12:45:06 crc kubenswrapper[4892]: I1006 12:45:06.183884 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27583c54-ec17-44a2-8240-224df02a4cbc" path="/var/lib/kubelet/pods/27583c54-ec17-44a2-8240-224df02a4cbc/volumes" Oct 06 12:45:22 crc kubenswrapper[4892]: I1006 12:45:22.984286 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:45:22 crc kubenswrapper[4892]: I1006 12:45:22.984984 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:45:22 crc kubenswrapper[4892]: I1006 12:45:22.985030 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:45:22 crc kubenswrapper[4892]: I1006 12:45:22.985588 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aaf00a7af53f2aef2fe68e84f3f54bc362b13cdfe9c13a663cf8163c9b021ad5"} 
pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:45:22 crc kubenswrapper[4892]: I1006 12:45:22.985640 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://aaf00a7af53f2aef2fe68e84f3f54bc362b13cdfe9c13a663cf8163c9b021ad5" gracePeriod=600 Oct 06 12:45:23 crc kubenswrapper[4892]: I1006 12:45:23.828884 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="aaf00a7af53f2aef2fe68e84f3f54bc362b13cdfe9c13a663cf8163c9b021ad5" exitCode=0 Oct 06 12:45:23 crc kubenswrapper[4892]: I1006 12:45:23.828963 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"aaf00a7af53f2aef2fe68e84f3f54bc362b13cdfe9c13a663cf8163c9b021ad5"} Oct 06 12:45:23 crc kubenswrapper[4892]: I1006 12:45:23.829462 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec"} Oct 06 12:45:23 crc kubenswrapper[4892]: I1006 12:45:23.829484 4892 scope.go:117] "RemoveContainer" containerID="9efcc1001cdef9eca4e8cd9cd682295d1c51c25017dbff95c41e203f78205262" Oct 06 12:46:02 crc kubenswrapper[4892]: I1006 12:46:02.299083 4892 scope.go:117] "RemoveContainer" containerID="df7c3cdb8251a98ff34a4de6decf8fbe7f743e3388925f7c2f38736e85756ee8" Oct 06 12:47:52 crc kubenswrapper[4892]: I1006 12:47:52.984373 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:47:52 crc kubenswrapper[4892]: I1006 12:47:52.985115 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.274477 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qhfg5"] Oct 06 12:48:05 crc kubenswrapper[4892]: E1006 12:48:05.276818 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6d00e2-2088-4bd1-8269-2337f0421848" containerName="collect-profiles" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.276960 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6d00e2-2088-4bd1-8269-2337f0421848" containerName="collect-profiles" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.277375 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6d00e2-2088-4bd1-8269-2337f0421848" containerName="collect-profiles" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.279004 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.294407 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhfg5"] Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.396202 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-utilities\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.396402 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-catalog-content\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.396438 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf7cp\" (UniqueName: \"kubernetes.io/projected/97129574-d37a-4fb6-835d-96fbcf1a2ec4-kube-api-access-qf7cp\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.500473 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-utilities\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.500646 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-catalog-content\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.500690 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf7cp\" (UniqueName: \"kubernetes.io/projected/97129574-d37a-4fb6-835d-96fbcf1a2ec4-kube-api-access-qf7cp\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.501529 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-utilities\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.501783 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-catalog-content\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.540810 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qf7cp\" (UniqueName: \"kubernetes.io/projected/97129574-d37a-4fb6-835d-96fbcf1a2ec4-kube-api-access-qf7cp\") pod \"community-operators-qhfg5\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.610554 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:05 crc kubenswrapper[4892]: I1006 12:48:05.989755 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhfg5"] Oct 06 12:48:06 crc kubenswrapper[4892]: I1006 12:48:06.735366 4892 generic.go:334] "Generic (PLEG): container finished" podID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerID="1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e" exitCode=0 Oct 06 12:48:06 crc kubenswrapper[4892]: I1006 12:48:06.735483 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhfg5" event={"ID":"97129574-d37a-4fb6-835d-96fbcf1a2ec4","Type":"ContainerDied","Data":"1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e"} Oct 06 12:48:06 crc kubenswrapper[4892]: I1006 12:48:06.736474 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhfg5" event={"ID":"97129574-d37a-4fb6-835d-96fbcf1a2ec4","Type":"ContainerStarted","Data":"dee2e47b721f572c8c005672b1a91ac4ae3536c8a10f6f519204b163cb101272"} Oct 06 12:48:06 crc kubenswrapper[4892]: I1006 12:48:06.738738 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:48:08 crc kubenswrapper[4892]: I1006 12:48:08.757489 4892 generic.go:334] "Generic (PLEG): container finished" podID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerID="7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397" exitCode=0 Oct 06 12:48:08 crc kubenswrapper[4892]: I1006 12:48:08.757579 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhfg5" event={"ID":"97129574-d37a-4fb6-835d-96fbcf1a2ec4","Type":"ContainerDied","Data":"7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397"} Oct 06 12:48:09 crc kubenswrapper[4892]: I1006 12:48:09.771765 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhfg5" event={"ID":"97129574-d37a-4fb6-835d-96fbcf1a2ec4","Type":"ContainerStarted","Data":"3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5"} Oct 06 12:48:09 crc kubenswrapper[4892]: I1006 12:48:09.795010 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qhfg5" podStartSLOduration=2.364794454 podStartE2EDuration="4.794989085s" podCreationTimestamp="2025-10-06 12:48:05 +0000 UTC" firstStartedPulling="2025-10-06 12:48:06.738523632 +0000 UTC m=+2373.288229397" lastFinishedPulling="2025-10-06 12:48:09.168718233 +0000 UTC m=+2375.718424028" observedRunningTime="2025-10-06 12:48:09.790850326 +0000 UTC m=+2376.340556091" watchObservedRunningTime="2025-10-06 12:48:09.794989085 +0000 UTC m=+2376.344694850" Oct 06 12:48:15 crc kubenswrapper[4892]: I1006 12:48:15.610816 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:15 crc kubenswrapper[4892]: I1006 12:48:15.611537 4892 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:15 crc kubenswrapper[4892]: I1006 12:48:15.681720 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:15 crc kubenswrapper[4892]: I1006 12:48:15.894077 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:15 crc kubenswrapper[4892]: I1006 12:48:15.952449 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhfg5"] Oct 06 12:48:17 crc kubenswrapper[4892]: I1006 12:48:17.861835 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qhfg5" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="registry-server" containerID="cri-o://3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5" gracePeriod=2 Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.434779 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.490963 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-utilities\") pod \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.491121 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-catalog-content\") pod \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.491244 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf7cp\" (UniqueName: \"kubernetes.io/projected/97129574-d37a-4fb6-835d-96fbcf1a2ec4-kube-api-access-qf7cp\") pod \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\" (UID: \"97129574-d37a-4fb6-835d-96fbcf1a2ec4\") " Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.491916 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-utilities" (OuterVolumeSpecName: "utilities") pod "97129574-d37a-4fb6-835d-96fbcf1a2ec4" (UID: "97129574-d37a-4fb6-835d-96fbcf1a2ec4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.497470 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97129574-d37a-4fb6-835d-96fbcf1a2ec4-kube-api-access-qf7cp" (OuterVolumeSpecName: "kube-api-access-qf7cp") pod "97129574-d37a-4fb6-835d-96fbcf1a2ec4" (UID: "97129574-d37a-4fb6-835d-96fbcf1a2ec4"). InnerVolumeSpecName "kube-api-access-qf7cp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.546062 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97129574-d37a-4fb6-835d-96fbcf1a2ec4" (UID: "97129574-d37a-4fb6-835d-96fbcf1a2ec4"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.593901 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.593939 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf7cp\" (UniqueName: \"kubernetes.io/projected/97129574-d37a-4fb6-835d-96fbcf1a2ec4-kube-api-access-qf7cp\") on node \"crc\" DevicePath \"\"" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.593954 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97129574-d37a-4fb6-835d-96fbcf1a2ec4-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.889835 4892 generic.go:334] "Generic (PLEG): container finished" podID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerID="3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5" exitCode=0 Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.889955 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qhfg5" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.890024 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhfg5" event={"ID":"97129574-d37a-4fb6-835d-96fbcf1a2ec4","Type":"ContainerDied","Data":"3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5"} Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.890107 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhfg5" event={"ID":"97129574-d37a-4fb6-835d-96fbcf1a2ec4","Type":"ContainerDied","Data":"dee2e47b721f572c8c005672b1a91ac4ae3536c8a10f6f519204b163cb101272"} Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.890180 4892 scope.go:117] "RemoveContainer" containerID="3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.915744 4892 scope.go:117] "RemoveContainer" containerID="7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.930472 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhfg5"] Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.938687 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qhfg5"] Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.948605 4892 scope.go:117] "RemoveContainer" containerID="1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.996695 4892 scope.go:117] "RemoveContainer" containerID="3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5" Oct 06 12:48:18 crc kubenswrapper[4892]: E1006 12:48:18.997320 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5\": container with ID starting with 3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5 not found: ID does not exist" containerID="3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5" Oct 06 12:48:18 crc 
kubenswrapper[4892]: I1006 12:48:18.997390 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5"} err="failed to get container status \"3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5\": rpc error: code = NotFound desc = could not find container \"3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5\": container with ID starting with 3e2cc70f09645335160cd40871236b262aba5a7520d11db505dbe8c3296c49e5 not found: ID does not exist" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.997425 4892 scope.go:117] "RemoveContainer" containerID="7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397" Oct 06 12:48:18 crc kubenswrapper[4892]: E1006 12:48:18.997797 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397\": container with ID starting with 7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397 not found: ID does not exist" containerID="7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.997836 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397"} err="failed to get container status \"7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397\": rpc error: code = NotFound desc = could not find container \"7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397\": container with ID starting with 7978a1c367c0ff495e8d0d941ccedc1de82448d5dcab845fc4632d9085755397 not found: ID does not exist" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.997858 4892 scope.go:117] "RemoveContainer" containerID="1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e" Oct 06 12:48:18 crc kubenswrapper[4892]: E1006 12:48:18.998249 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e\": container with ID starting with 1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e not found: ID does not exist" containerID="1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e" Oct 06 12:48:18 crc kubenswrapper[4892]: I1006 12:48:18.998289 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e"} err="failed to get container status \"1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e\": rpc error: code = NotFound desc = could not find container \"1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e\": container with ID starting with 1d89bc3060ee23184b60695f65c105f5fd480e135385689f7e9f3ba7dd28ca9e not found: ID does not exist" Oct 06 12:48:20 crc kubenswrapper[4892]: I1006 12:48:20.308622 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" path="/var/lib/kubelet/pods/97129574-d37a-4fb6-835d-96fbcf1a2ec4/volumes" Oct 06 12:48:22 crc kubenswrapper[4892]: I1006 12:48:22.984206 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:48:22 crc kubenswrapper[4892]: I1006 12:48:22.984642 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:48:52 crc kubenswrapper[4892]: I1006 12:48:52.984516 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:48:52 crc kubenswrapper[4892]: I1006 12:48:52.985309 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:48:52 crc kubenswrapper[4892]: I1006 12:48:52.985411 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:48:52 crc kubenswrapper[4892]: I1006 12:48:52.986510 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:48:52 crc kubenswrapper[4892]: I1006 12:48:52.986625 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" gracePeriod=600 Oct 06 12:48:53 crc kubenswrapper[4892]: E1006 12:48:53.123569 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:48:53 crc kubenswrapper[4892]: I1006 12:48:53.287850 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" exitCode=0 Oct 06 12:48:53 crc kubenswrapper[4892]: I1006 12:48:53.287914 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec"} Oct 06 12:48:53 crc kubenswrapper[4892]: I1006 12:48:53.287961 4892 scope.go:117] "RemoveContainer" containerID="aaf00a7af53f2aef2fe68e84f3f54bc362b13cdfe9c13a663cf8163c9b021ad5" Oct 06 
12:48:53 crc kubenswrapper[4892]: I1006 12:48:53.289834 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:48:53 crc kubenswrapper[4892]: E1006 12:48:53.290232 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:49:07 crc kubenswrapper[4892]: I1006 12:49:07.168791 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:49:07 crc kubenswrapper[4892]: E1006 12:49:07.170155 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:49:09 crc kubenswrapper[4892]: I1006 12:49:09.475144 4892 generic.go:334] "Generic (PLEG): container finished" podID="aa2cf17a-4ca9-414f-9421-46fa314679d0" containerID="37e8ea72bbfffb2258b60084302aedfc8f033691f4e4f56002df08f84cff337c" exitCode=0 Oct 06 12:49:09 crc kubenswrapper[4892]: I1006 12:49:09.475165 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" event={"ID":"aa2cf17a-4ca9-414f-9421-46fa314679d0","Type":"ContainerDied","Data":"37e8ea72bbfffb2258b60084302aedfc8f033691f4e4f56002df08f84cff337c"} Oct 06 12:49:10 crc kubenswrapper[4892]: I1006 12:49:10.928507 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.021436 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-inventory\") pod \"aa2cf17a-4ca9-414f-9421-46fa314679d0\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.021712 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q62l5\" (UniqueName: \"kubernetes.io/projected/aa2cf17a-4ca9-414f-9421-46fa314679d0-kube-api-access-q62l5\") pod \"aa2cf17a-4ca9-414f-9421-46fa314679d0\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.021809 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-ssh-key\") pod \"aa2cf17a-4ca9-414f-9421-46fa314679d0\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.021934 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-combined-ca-bundle\") pod \"aa2cf17a-4ca9-414f-9421-46fa314679d0\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.022031 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-secret-0\") pod \"aa2cf17a-4ca9-414f-9421-46fa314679d0\" (UID: \"aa2cf17a-4ca9-414f-9421-46fa314679d0\") " Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.026729 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2cf17a-4ca9-414f-9421-46fa314679d0-kube-api-access-q62l5" (OuterVolumeSpecName: "kube-api-access-q62l5") pod "aa2cf17a-4ca9-414f-9421-46fa314679d0" (UID: "aa2cf17a-4ca9-414f-9421-46fa314679d0"). InnerVolumeSpecName "kube-api-access-q62l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.027187 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "aa2cf17a-4ca9-414f-9421-46fa314679d0" (UID: "aa2cf17a-4ca9-414f-9421-46fa314679d0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.057936 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "aa2cf17a-4ca9-414f-9421-46fa314679d0" (UID: "aa2cf17a-4ca9-414f-9421-46fa314679d0"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.058730 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-inventory" (OuterVolumeSpecName: "inventory") pod "aa2cf17a-4ca9-414f-9421-46fa314679d0" (UID: "aa2cf17a-4ca9-414f-9421-46fa314679d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.060109 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa2cf17a-4ca9-414f-9421-46fa314679d0" (UID: "aa2cf17a-4ca9-414f-9421-46fa314679d0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.124169 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.124203 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q62l5\" (UniqueName: \"kubernetes.io/projected/aa2cf17a-4ca9-414f-9421-46fa314679d0-kube-api-access-q62l5\") on node \"crc\" DevicePath \"\"" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.124216 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.124225 4892 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.124235 4892 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/aa2cf17a-4ca9-414f-9421-46fa314679d0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.503866 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" event={"ID":"aa2cf17a-4ca9-414f-9421-46fa314679d0","Type":"ContainerDied","Data":"df01bf2569fca879c0cef8dde6cb292cdf75e28f986afa6e5126f306aa110046"} Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.503931 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df01bf2569fca879c0cef8dde6cb292cdf75e28f986afa6e5126f306aa110046" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.504372 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6njgq" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.650920 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6"] Oct 06 12:49:11 crc kubenswrapper[4892]: E1006 12:49:11.651419 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="extract-content" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.651440 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="extract-content" Oct 06 12:49:11 crc kubenswrapper[4892]: E1006 12:49:11.651472 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="extract-utilities" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.651482 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="extract-utilities" Oct 06 12:49:11 crc kubenswrapper[4892]: E1006 12:49:11.651512 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="registry-server" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.651521 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="registry-server" Oct 06 12:49:11 crc kubenswrapper[4892]: E1006 12:49:11.651540 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2cf17a-4ca9-414f-9421-46fa314679d0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.651548 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2cf17a-4ca9-414f-9421-46fa314679d0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.651798 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa2cf17a-4ca9-414f-9421-46fa314679d0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.651829 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="97129574-d37a-4fb6-835d-96fbcf1a2ec4" containerName="registry-server" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.652720 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.656124 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.656392 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.656589 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.656605 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.656625 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.657945 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.664072 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.668304 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6"] Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845295 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845369 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845400 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845441 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6fr8\" (UniqueName: \"kubernetes.io/projected/06eb266c-79fd-49cd-9071-1ed4446e94d6-kube-api-access-v6fr8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845463 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845582 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845677 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845717 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.845753 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.947521 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948174 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948266 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948404 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948506 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948587 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948681 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6fr8\" (UniqueName: \"kubernetes.io/projected/06eb266c-79fd-49cd-9071-1ed4446e94d6-kube-api-access-v6fr8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948757 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.948860 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.950642 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.954623 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.954653 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.956455 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.960540 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.960878 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.966738 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.966957 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.971973 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6fr8\" (UniqueName: \"kubernetes.io/projected/06eb266c-79fd-49cd-9071-1ed4446e94d6-kube-api-access-v6fr8\") pod \"nova-edpm-deployment-openstack-edpm-ipam-rjll6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:11 crc kubenswrapper[4892]: I1006 12:49:11.974044 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:49:12 crc kubenswrapper[4892]: I1006 12:49:12.488935 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6"] Oct 06 12:49:12 crc kubenswrapper[4892]: W1006 12:49:12.492849 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06eb266c_79fd_49cd_9071_1ed4446e94d6.slice/crio-81982d7143f6daf7b87c27dce921fe0e2d2969f118cc6d7ce1c00d934a7827c2 WatchSource:0}: Error finding container 81982d7143f6daf7b87c27dce921fe0e2d2969f118cc6d7ce1c00d934a7827c2: Status 404 returned error can't find the container with id 81982d7143f6daf7b87c27dce921fe0e2d2969f118cc6d7ce1c00d934a7827c2 Oct 06 12:49:12 crc kubenswrapper[4892]: I1006 12:49:12.514283 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" event={"ID":"06eb266c-79fd-49cd-9071-1ed4446e94d6","Type":"ContainerStarted","Data":"81982d7143f6daf7b87c27dce921fe0e2d2969f118cc6d7ce1c00d934a7827c2"} Oct 06 12:49:14 crc kubenswrapper[4892]: I1006 12:49:14.540991 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" event={"ID":"06eb266c-79fd-49cd-9071-1ed4446e94d6","Type":"ContainerStarted","Data":"72a7d694c4005175b206a07d6e72eaa6db3edc86f78f7e5c357bb02402f0a6c4"} Oct 06 12:49:20 crc kubenswrapper[4892]: I1006 12:49:20.168776 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:49:20 crc kubenswrapper[4892]: E1006 12:49:20.170105 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:49:34 crc kubenswrapper[4892]: I1006 12:49:34.181600 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:49:34 crc kubenswrapper[4892]: E1006 12:49:34.182400 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:49:48 crc kubenswrapper[4892]: I1006 12:49:48.175351 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:49:48 crc kubenswrapper[4892]: E1006 12:49:48.176684 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:49:59 crc kubenswrapper[4892]: I1006 
12:49:59.168859 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:49:59 crc kubenswrapper[4892]: E1006 12:49:59.169927 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:50:13 crc kubenswrapper[4892]: I1006 12:50:13.168996 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:50:13 crc kubenswrapper[4892]: E1006 12:50:13.169832 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:50:27 crc kubenswrapper[4892]: I1006 12:50:27.168859 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:50:27 crc kubenswrapper[4892]: E1006 12:50:27.169457 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:50:39 crc kubenswrapper[4892]: I1006 12:50:39.169830 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:50:39 crc kubenswrapper[4892]: E1006 12:50:39.170597 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:50:54 crc kubenswrapper[4892]: I1006 12:50:54.175428 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:50:54 crc kubenswrapper[4892]: E1006 12:50:54.176413 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:51:05 crc kubenswrapper[4892]: I1006 12:51:05.170196 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:51:05 crc kubenswrapper[4892]: E1006 12:51:05.171623 
4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:51:18 crc kubenswrapper[4892]: I1006 12:51:18.169100 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:51:18 crc kubenswrapper[4892]: E1006 12:51:18.169876 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:51:31 crc kubenswrapper[4892]: I1006 12:51:31.169467 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:51:31 crc kubenswrapper[4892]: E1006 12:51:31.170729 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:51:45 crc kubenswrapper[4892]: I1006 12:51:45.170026 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:51:45 crc kubenswrapper[4892]: E1006 12:51:45.171068 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:52:00 crc kubenswrapper[4892]: I1006 12:52:00.168631 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:52:00 crc kubenswrapper[4892]: E1006 12:52:00.170002 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:52:08 crc kubenswrapper[4892]: I1006 12:52:08.958729 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" podStartSLOduration=177.183007648 podStartE2EDuration="2m57.958704388s" podCreationTimestamp="2025-10-06 12:49:11 +0000 UTC" firstStartedPulling="2025-10-06 12:49:12.496419464 +0000 UTC m=+2439.046125219" lastFinishedPulling="2025-10-06 
12:49:13.272116194 +0000 UTC m=+2439.821821959" observedRunningTime="2025-10-06 12:49:14.572009444 +0000 UTC m=+2441.121715239" watchObservedRunningTime="2025-10-06 12:52:08.958704388 +0000 UTC m=+2615.508410153" Oct 06 12:52:08 crc kubenswrapper[4892]: I1006 12:52:08.966127 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dd2r2"] Oct 06 12:52:08 crc kubenswrapper[4892]: I1006 12:52:08.969458 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd2r2"] Oct 06 12:52:08 crc kubenswrapper[4892]: I1006 12:52:08.969558 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.107671 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ffn\" (UniqueName: \"kubernetes.io/projected/2c53bf57-0c53-43e0-90b8-ca1c8245f859-kube-api-access-z7ffn\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.107752 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-catalog-content\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.107775 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-utilities\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.210092 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7ffn\" (UniqueName: \"kubernetes.io/projected/2c53bf57-0c53-43e0-90b8-ca1c8245f859-kube-api-access-z7ffn\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.210173 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-catalog-content\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.210193 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-utilities\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.210732 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-utilities\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: 
I1006 12:52:09.210784 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-catalog-content\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.236599 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7ffn\" (UniqueName: \"kubernetes.io/projected/2c53bf57-0c53-43e0-90b8-ca1c8245f859-kube-api-access-z7ffn\") pod \"redhat-operators-dd2r2\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.291814 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:09 crc kubenswrapper[4892]: I1006 12:52:09.793924 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dd2r2"] Oct 06 12:52:10 crc kubenswrapper[4892]: I1006 12:52:10.560490 4892 generic.go:334] "Generic (PLEG): container finished" podID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerID="97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e" exitCode=0 Oct 06 12:52:10 crc kubenswrapper[4892]: I1006 12:52:10.560686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd2r2" event={"ID":"2c53bf57-0c53-43e0-90b8-ca1c8245f859","Type":"ContainerDied","Data":"97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e"} Oct 06 12:52:10 crc kubenswrapper[4892]: I1006 12:52:10.560848 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd2r2" event={"ID":"2c53bf57-0c53-43e0-90b8-ca1c8245f859","Type":"ContainerStarted","Data":"8542beeddc435c026fa02b4496acaeb90aa6339513dc3a3d86ddc43b68e6308a"} Oct 06 12:52:12 crc kubenswrapper[4892]: I1006 12:52:12.586515 4892 generic.go:334] "Generic (PLEG): container finished" podID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerID="d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d" exitCode=0 Oct 06 12:52:12 crc kubenswrapper[4892]: I1006 12:52:12.586631 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd2r2" event={"ID":"2c53bf57-0c53-43e0-90b8-ca1c8245f859","Type":"ContainerDied","Data":"d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d"} Oct 06 12:52:14 crc kubenswrapper[4892]: I1006 12:52:14.620413 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd2r2" event={"ID":"2c53bf57-0c53-43e0-90b8-ca1c8245f859","Type":"ContainerStarted","Data":"32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315"} Oct 06 12:52:15 crc kubenswrapper[4892]: I1006 12:52:15.168953 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:52:15 crc kubenswrapper[4892]: E1006 12:52:15.169372 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" 
podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:52:19 crc kubenswrapper[4892]: I1006 12:52:19.292419 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:19 crc kubenswrapper[4892]: I1006 12:52:19.293010 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:19 crc kubenswrapper[4892]: I1006 12:52:19.372269 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:19 crc kubenswrapper[4892]: I1006 12:52:19.395405 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dd2r2" podStartSLOduration=8.578425696 podStartE2EDuration="11.395384933s" podCreationTimestamp="2025-10-06 12:52:08 +0000 UTC" firstStartedPulling="2025-10-06 12:52:10.56246779 +0000 UTC m=+2617.112173575" lastFinishedPulling="2025-10-06 12:52:13.379427007 +0000 UTC m=+2619.929132812" observedRunningTime="2025-10-06 12:52:14.640832794 +0000 UTC m=+2621.190538599" watchObservedRunningTime="2025-10-06 12:52:19.395384933 +0000 UTC m=+2625.945090688" Oct 06 12:52:19 crc kubenswrapper[4892]: I1006 12:52:19.776779 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:19 crc kubenswrapper[4892]: I1006 12:52:19.834190 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd2r2"] Oct 06 12:52:21 crc kubenswrapper[4892]: I1006 12:52:21.710512 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dd2r2" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="registry-server" containerID="cri-o://32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315" gracePeriod=2 Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.224906 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.309694 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-utilities\") pod \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.309891 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7ffn\" (UniqueName: \"kubernetes.io/projected/2c53bf57-0c53-43e0-90b8-ca1c8245f859-kube-api-access-z7ffn\") pod \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.310051 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-catalog-content\") pod \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\" (UID: \"2c53bf57-0c53-43e0-90b8-ca1c8245f859\") " Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.311709 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-utilities" (OuterVolumeSpecName: "utilities") pod "2c53bf57-0c53-43e0-90b8-ca1c8245f859" (UID: "2c53bf57-0c53-43e0-90b8-ca1c8245f859"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.315225 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c53bf57-0c53-43e0-90b8-ca1c8245f859-kube-api-access-z7ffn" (OuterVolumeSpecName: "kube-api-access-z7ffn") pod "2c53bf57-0c53-43e0-90b8-ca1c8245f859" (UID: "2c53bf57-0c53-43e0-90b8-ca1c8245f859"). InnerVolumeSpecName "kube-api-access-z7ffn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.413870 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.413910 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7ffn\" (UniqueName: \"kubernetes.io/projected/2c53bf57-0c53-43e0-90b8-ca1c8245f859-kube-api-access-z7ffn\") on node \"crc\" DevicePath \"\"" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.725212 4892 generic.go:334] "Generic (PLEG): container finished" podID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerID="32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315" exitCode=0 Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.725301 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dd2r2" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.725275 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd2r2" event={"ID":"2c53bf57-0c53-43e0-90b8-ca1c8245f859","Type":"ContainerDied","Data":"32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315"} Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.725464 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dd2r2" event={"ID":"2c53bf57-0c53-43e0-90b8-ca1c8245f859","Type":"ContainerDied","Data":"8542beeddc435c026fa02b4496acaeb90aa6339513dc3a3d86ddc43b68e6308a"} Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.725505 4892 scope.go:117] "RemoveContainer" containerID="32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.751417 4892 scope.go:117] "RemoveContainer" containerID="d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.778780 4892 scope.go:117] "RemoveContainer" containerID="97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.871636 4892 scope.go:117] "RemoveContainer" containerID="32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315" Oct 06 12:52:22 crc kubenswrapper[4892]: E1006 12:52:22.873061 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315\": container with ID starting with 32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315 not found: ID does not exist" containerID="32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.873108 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315"} err="failed to get container status \"32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315\": rpc error: code = NotFound desc = could not find container \"32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315\": container with ID starting with 32c6048cf2486d619745d36da99ae70a0bee1c76eb245c7bfc5a58279e3cb315 not found: ID does not exist" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.873137 4892 scope.go:117] "RemoveContainer" containerID="d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d" Oct 06 12:52:22 crc kubenswrapper[4892]: E1006 12:52:22.873666 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d\": container with ID starting with d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d not found: ID does not exist" containerID="d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.873716 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d"} err="failed to get container status \"d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d\": rpc error: code = NotFound desc = could not find container 
\"d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d\": container with ID starting with d838d6f216925476fecb56137a91e9d976db8ba59fbec82c6f9389963353ed3d not found: ID does not exist" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.873751 4892 scope.go:117] "RemoveContainer" containerID="97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e" Oct 06 12:52:22 crc kubenswrapper[4892]: E1006 12:52:22.874282 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e\": container with ID starting with 97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e not found: ID does not exist" containerID="97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e" Oct 06 12:52:22 crc kubenswrapper[4892]: I1006 12:52:22.874363 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e"} err="failed to get container status \"97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e\": rpc error: code = NotFound desc = could not find container \"97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e\": container with ID starting with 97862c38802ce889b4c8baf406c5028c9ad67369ff36dbcbeb1e75d21389a99e not found: ID does not exist" Oct 06 12:52:23 crc kubenswrapper[4892]: I1006 12:52:23.496352 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c53bf57-0c53-43e0-90b8-ca1c8245f859" (UID: "2c53bf57-0c53-43e0-90b8-ca1c8245f859"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:52:23 crc kubenswrapper[4892]: I1006 12:52:23.540202 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c53bf57-0c53-43e0-90b8-ca1c8245f859-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:52:23 crc kubenswrapper[4892]: I1006 12:52:23.673099 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dd2r2"] Oct 06 12:52:23 crc kubenswrapper[4892]: I1006 12:52:23.683269 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dd2r2"] Oct 06 12:52:24 crc kubenswrapper[4892]: I1006 12:52:24.184794 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" path="/var/lib/kubelet/pods/2c53bf57-0c53-43e0-90b8-ca1c8245f859/volumes" Oct 06 12:52:30 crc kubenswrapper[4892]: I1006 12:52:30.169671 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:52:30 crc kubenswrapper[4892]: E1006 12:52:30.170608 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:52:45 crc kubenswrapper[4892]: I1006 12:52:45.171347 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:52:45 crc kubenswrapper[4892]: E1006 12:52:45.172212 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:52:58 crc kubenswrapper[4892]: I1006 12:52:58.168560 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:52:58 crc kubenswrapper[4892]: E1006 12:52:58.169407 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:52:59 crc kubenswrapper[4892]: I1006 12:52:59.148928 4892 generic.go:334] "Generic (PLEG): container finished" podID="06eb266c-79fd-49cd-9071-1ed4446e94d6" containerID="72a7d694c4005175b206a07d6e72eaa6db3edc86f78f7e5c357bb02402f0a6c4" exitCode=0 Oct 06 12:52:59 crc kubenswrapper[4892]: I1006 12:52:59.149037 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" event={"ID":"06eb266c-79fd-49cd-9071-1ed4446e94d6","Type":"ContainerDied","Data":"72a7d694c4005175b206a07d6e72eaa6db3edc86f78f7e5c357bb02402f0a6c4"} Oct 06 12:53:00 crc 
kubenswrapper[4892]: I1006 12:53:00.608838 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.760831 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-inventory\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.760974 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-extra-config-0\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.761003 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-0\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.761022 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-1\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.761049 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-0\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.761100 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6fr8\" (UniqueName: \"kubernetes.io/projected/06eb266c-79fd-49cd-9071-1ed4446e94d6-kube-api-access-v6fr8\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.761126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-1\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.761158 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-ssh-key\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.766423 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06eb266c-79fd-49cd-9071-1ed4446e94d6-kube-api-access-v6fr8" (OuterVolumeSpecName: "kube-api-access-v6fr8") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "kube-api-access-v6fr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.768235 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-combined-ca-bundle\") pod \"06eb266c-79fd-49cd-9071-1ed4446e94d6\" (UID: \"06eb266c-79fd-49cd-9071-1ed4446e94d6\") " Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.769169 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6fr8\" (UniqueName: \"kubernetes.io/projected/06eb266c-79fd-49cd-9071-1ed4446e94d6-kube-api-access-v6fr8\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.772135 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.799417 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.799746 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.801446 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-inventory" (OuterVolumeSpecName: "inventory") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.803384 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.810969 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.814502 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.814833 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "06eb266c-79fd-49cd-9071-1ed4446e94d6" (UID: "06eb266c-79fd-49cd-9071-1ed4446e94d6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871450 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871509 4892 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871524 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871537 4892 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871547 4892 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871559 4892 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871570 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:00 crc kubenswrapper[4892]: I1006 12:53:00.871580 4892 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eb266c-79fd-49cd-9071-1ed4446e94d6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.169559 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" event={"ID":"06eb266c-79fd-49cd-9071-1ed4446e94d6","Type":"ContainerDied","Data":"81982d7143f6daf7b87c27dce921fe0e2d2969f118cc6d7ce1c00d934a7827c2"} Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 
12:53:01.169602 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81982d7143f6daf7b87c27dce921fe0e2d2969f118cc6d7ce1c00d934a7827c2" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.169613 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-rjll6" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.318308 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f"] Oct 06 12:53:01 crc kubenswrapper[4892]: E1006 12:53:01.318747 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="extract-content" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.318762 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="extract-content" Oct 06 12:53:01 crc kubenswrapper[4892]: E1006 12:53:01.318786 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="extract-utilities" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.318793 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="extract-utilities" Oct 06 12:53:01 crc kubenswrapper[4892]: E1006 12:53:01.318806 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="registry-server" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.318812 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="registry-server" Oct 06 12:53:01 crc kubenswrapper[4892]: E1006 12:53:01.318839 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06eb266c-79fd-49cd-9071-1ed4446e94d6" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.318845 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="06eb266c-79fd-49cd-9071-1ed4446e94d6" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.319031 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c53bf57-0c53-43e0-90b8-ca1c8245f859" containerName="registry-server" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.319048 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="06eb266c-79fd-49cd-9071-1ed4446e94d6" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.319754 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.325436 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kck66" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.325454 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.325444 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.325562 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.325783 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.329485 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f"] Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.379973 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.380053 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.380111 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.380133 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.380156 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 
12:53:01.380187 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.380214 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pc4p\" (UniqueName: \"kubernetes.io/projected/ab077a9a-134b-497a-abce-777fb1303160-kube-api-access-7pc4p\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.481881 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pc4p\" (UniqueName: \"kubernetes.io/projected/ab077a9a-134b-497a-abce-777fb1303160-kube-api-access-7pc4p\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.482049 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.482109 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.482164 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.482188 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.482218 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: 
\"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.482237 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.486913 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.487891 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.490195 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.492755 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.495049 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.495551 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.514782 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pc4p\" (UniqueName: \"kubernetes.io/projected/ab077a9a-134b-497a-abce-777fb1303160-kube-api-access-7pc4p\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f\" (UID: 
\"ab077a9a-134b-497a-abce-777fb1303160\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:01 crc kubenswrapper[4892]: I1006 12:53:01.636269 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:53:02 crc kubenswrapper[4892]: W1006 12:53:02.188349 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab077a9a_134b_497a_abce_777fb1303160.slice/crio-276f05e2b16a9985e4c9f8259636b62b4db215819beecb7b11d8b92c12cc9d82 WatchSource:0}: Error finding container 276f05e2b16a9985e4c9f8259636b62b4db215819beecb7b11d8b92c12cc9d82: Status 404 returned error can't find the container with id 276f05e2b16a9985e4c9f8259636b62b4db215819beecb7b11d8b92c12cc9d82 Oct 06 12:53:02 crc kubenswrapper[4892]: I1006 12:53:02.201258 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f"] Oct 06 12:53:03 crc kubenswrapper[4892]: I1006 12:53:03.200809 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" event={"ID":"ab077a9a-134b-497a-abce-777fb1303160","Type":"ContainerStarted","Data":"42f9cfd958d7260420bdfa02db2f1ed8040ce0e73e4d1626551da5ad96f10794"} Oct 06 12:53:03 crc kubenswrapper[4892]: I1006 12:53:03.201166 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" event={"ID":"ab077a9a-134b-497a-abce-777fb1303160","Type":"ContainerStarted","Data":"276f05e2b16a9985e4c9f8259636b62b4db215819beecb7b11d8b92c12cc9d82"} Oct 06 12:53:03 crc kubenswrapper[4892]: I1006 12:53:03.232913 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" podStartSLOduration=1.6713765409999999 podStartE2EDuration="2.23289041s" podCreationTimestamp="2025-10-06 12:53:01 +0000 UTC" firstStartedPulling="2025-10-06 12:53:02.191124072 +0000 UTC m=+2668.740829837" lastFinishedPulling="2025-10-06 12:53:02.752637941 +0000 UTC m=+2669.302343706" observedRunningTime="2025-10-06 12:53:03.221295858 +0000 UTC m=+2669.771001633" watchObservedRunningTime="2025-10-06 12:53:03.23289041 +0000 UTC m=+2669.782596185" Oct 06 12:53:11 crc kubenswrapper[4892]: I1006 12:53:11.169634 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:53:11 crc kubenswrapper[4892]: E1006 12:53:11.170680 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:53:26 crc kubenswrapper[4892]: I1006 12:53:26.168982 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:53:26 crc kubenswrapper[4892]: E1006 12:53:26.169794 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:53:40 crc kubenswrapper[4892]: I1006 12:53:40.169408 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:53:40 crc kubenswrapper[4892]: E1006 12:53:40.170833 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 12:53:53 crc kubenswrapper[4892]: I1006 12:53:53.169136 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:53:53 crc kubenswrapper[4892]: I1006 12:53:53.779410 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"d194753ef4190e3702b4f0846aa3401fc1db7562b22faa07ffe712a7046afbbf"} Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.255621 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q77z8"] Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.259918 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.285087 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q77z8"] Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.297704 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptmqc\" (UniqueName: \"kubernetes.io/projected/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-kube-api-access-ptmqc\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.297937 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-catalog-content\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.298047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-utilities\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.401535 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptmqc\" (UniqueName: \"kubernetes.io/projected/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-kube-api-access-ptmqc\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " 
pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.401634 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-catalog-content\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.401696 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-utilities\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.402525 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-utilities\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.402658 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-catalog-content\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.428638 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptmqc\" (UniqueName: \"kubernetes.io/projected/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-kube-api-access-ptmqc\") pod \"certified-operators-q77z8\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:55 crc kubenswrapper[4892]: I1006 12:54:55.593029 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:54:56 crc kubenswrapper[4892]: I1006 12:54:56.104914 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q77z8"] Oct 06 12:54:56 crc kubenswrapper[4892]: I1006 12:54:56.530850 4892 generic.go:334] "Generic (PLEG): container finished" podID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerID="bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529" exitCode=0 Oct 06 12:54:56 crc kubenswrapper[4892]: I1006 12:54:56.530971 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77z8" event={"ID":"1d393d08-49bb-4fea-ab58-1c905d1c5b6b","Type":"ContainerDied","Data":"bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529"} Oct 06 12:54:56 crc kubenswrapper[4892]: I1006 12:54:56.531409 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77z8" event={"ID":"1d393d08-49bb-4fea-ab58-1c905d1c5b6b","Type":"ContainerStarted","Data":"374236e1b8bc911284fdd23753919e777cb4bf974d2218ae1be394705195f12c"} Oct 06 12:54:56 crc kubenswrapper[4892]: I1006 12:54:56.532885 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 12:54:58 crc kubenswrapper[4892]: I1006 12:54:58.564434 4892 generic.go:334] "Generic (PLEG): container finished" podID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerID="832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471" exitCode=0 Oct 06 12:54:58 crc kubenswrapper[4892]: I1006 12:54:58.564516 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77z8" event={"ID":"1d393d08-49bb-4fea-ab58-1c905d1c5b6b","Type":"ContainerDied","Data":"832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471"} Oct 06 12:54:59 crc kubenswrapper[4892]: I1006 12:54:59.577744 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77z8" event={"ID":"1d393d08-49bb-4fea-ab58-1c905d1c5b6b","Type":"ContainerStarted","Data":"a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334"} Oct 06 12:54:59 crc kubenswrapper[4892]: I1006 12:54:59.607666 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q77z8" podStartSLOduration=2.139445523 podStartE2EDuration="4.60763999s" podCreationTimestamp="2025-10-06 12:54:55 +0000 UTC" firstStartedPulling="2025-10-06 12:54:56.532600894 +0000 UTC m=+2783.082306669" lastFinishedPulling="2025-10-06 12:54:59.000795361 +0000 UTC m=+2785.550501136" observedRunningTime="2025-10-06 12:54:59.602140853 +0000 UTC m=+2786.151846628" watchObservedRunningTime="2025-10-06 12:54:59.60763999 +0000 UTC m=+2786.157345795" Oct 06 12:55:05 crc kubenswrapper[4892]: I1006 12:55:05.593422 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:55:05 crc kubenswrapper[4892]: I1006 12:55:05.593968 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:55:05 crc kubenswrapper[4892]: I1006 12:55:05.679977 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:55:05 crc kubenswrapper[4892]: I1006 12:55:05.763513 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:55:05 crc kubenswrapper[4892]: I1006 12:55:05.927527 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q77z8"] Oct 06 12:55:07 crc kubenswrapper[4892]: I1006 12:55:07.672968 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q77z8" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="registry-server" containerID="cri-o://a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334" gracePeriod=2 Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.177366 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.326845 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-utilities\") pod \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.326976 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-catalog-content\") pod \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.327143 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptmqc\" (UniqueName: \"kubernetes.io/projected/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-kube-api-access-ptmqc\") pod \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\" (UID: \"1d393d08-49bb-4fea-ab58-1c905d1c5b6b\") " Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.327805 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-utilities" (OuterVolumeSpecName: "utilities") pod "1d393d08-49bb-4fea-ab58-1c905d1c5b6b" (UID: "1d393d08-49bb-4fea-ab58-1c905d1c5b6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.334268 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-kube-api-access-ptmqc" (OuterVolumeSpecName: "kube-api-access-ptmqc") pod "1d393d08-49bb-4fea-ab58-1c905d1c5b6b" (UID: "1d393d08-49bb-4fea-ab58-1c905d1c5b6b"). InnerVolumeSpecName "kube-api-access-ptmqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.369498 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d393d08-49bb-4fea-ab58-1c905d1c5b6b" (UID: "1d393d08-49bb-4fea-ab58-1c905d1c5b6b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.429773 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptmqc\" (UniqueName: \"kubernetes.io/projected/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-kube-api-access-ptmqc\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.429827 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.429843 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d393d08-49bb-4fea-ab58-1c905d1c5b6b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.687231 4892 generic.go:334] "Generic (PLEG): container finished" podID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerID="a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334" exitCode=0 Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.687274 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77z8" event={"ID":"1d393d08-49bb-4fea-ab58-1c905d1c5b6b","Type":"ContainerDied","Data":"a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334"} Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.687298 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q77z8" event={"ID":"1d393d08-49bb-4fea-ab58-1c905d1c5b6b","Type":"ContainerDied","Data":"374236e1b8bc911284fdd23753919e777cb4bf974d2218ae1be394705195f12c"} Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.687316 4892 scope.go:117] "RemoveContainer" containerID="a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.687456 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q77z8" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.737909 4892 scope.go:117] "RemoveContainer" containerID="832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.746650 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q77z8"] Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.762702 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q77z8"] Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.779984 4892 scope.go:117] "RemoveContainer" containerID="bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.852713 4892 scope.go:117] "RemoveContainer" containerID="a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334" Oct 06 12:55:08 crc kubenswrapper[4892]: E1006 12:55:08.853664 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334\": container with ID starting with a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334 not found: ID does not exist" containerID="a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.853712 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334"} err="failed to get container status \"a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334\": rpc error: code = NotFound desc = could not find container \"a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334\": container with ID starting with a4cd0a4e23d750c0252f7da70a44e41b37f353d87ad3953df959f1eceef58334 not found: ID does not exist" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.853741 4892 scope.go:117] "RemoveContainer" containerID="832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471" Oct 06 12:55:08 crc kubenswrapper[4892]: E1006 12:55:08.854264 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471\": container with ID starting with 832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471 not found: ID does not exist" containerID="832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.854930 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471"} err="failed to get container status \"832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471\": rpc error: code = NotFound desc = could not find container \"832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471\": container with ID starting with 832e5fd38769dea1e6b95d14a54d63e8123bc01717d5eee0f485cfb830f01471 not found: ID does not exist" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.854994 4892 scope.go:117] "RemoveContainer" containerID="bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529" Oct 06 12:55:08 crc kubenswrapper[4892]: E1006 12:55:08.855428 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529\": container with ID starting with bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529 not found: ID does not exist" containerID="bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529" Oct 06 12:55:08 crc kubenswrapper[4892]: I1006 12:55:08.855476 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529"} err="failed to get container status \"bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529\": rpc error: code = NotFound desc = could not find container \"bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529\": container with ID starting with bf9e3078dece321e512f1122ba2debd0d423e431dc0232a058c6c038f11d1529 not found: ID does not exist" Oct 06 12:55:10 crc kubenswrapper[4892]: I1006 12:55:10.192906 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" path="/var/lib/kubelet/pods/1d393d08-49bb-4fea-ab58-1c905d1c5b6b/volumes" Oct 06 12:55:40 crc kubenswrapper[4892]: I1006 12:55:40.081812 4892 generic.go:334] "Generic (PLEG): container finished" podID="ab077a9a-134b-497a-abce-777fb1303160" containerID="42f9cfd958d7260420bdfa02db2f1ed8040ce0e73e4d1626551da5ad96f10794" exitCode=0 Oct 06 12:55:40 crc kubenswrapper[4892]: I1006 12:55:40.081896 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" event={"ID":"ab077a9a-134b-497a-abce-777fb1303160","Type":"ContainerDied","Data":"42f9cfd958d7260420bdfa02db2f1ed8040ce0e73e4d1626551da5ad96f10794"} Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.628945 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.694696 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-inventory\") pod \"ab077a9a-134b-497a-abce-777fb1303160\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.694857 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pc4p\" (UniqueName: \"kubernetes.io/projected/ab077a9a-134b-497a-abce-777fb1303160-kube-api-access-7pc4p\") pod \"ab077a9a-134b-497a-abce-777fb1303160\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.694993 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-2\") pod \"ab077a9a-134b-497a-abce-777fb1303160\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.695077 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-1\") pod \"ab077a9a-134b-497a-abce-777fb1303160\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.695190 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ssh-key\") pod \"ab077a9a-134b-497a-abce-777fb1303160\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.695270 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-0\") pod \"ab077a9a-134b-497a-abce-777fb1303160\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.695489 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-telemetry-combined-ca-bundle\") pod \"ab077a9a-134b-497a-abce-777fb1303160\" (UID: \"ab077a9a-134b-497a-abce-777fb1303160\") " Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.703435 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ab077a9a-134b-497a-abce-777fb1303160" (UID: "ab077a9a-134b-497a-abce-777fb1303160"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.703695 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab077a9a-134b-497a-abce-777fb1303160-kube-api-access-7pc4p" (OuterVolumeSpecName: "kube-api-access-7pc4p") pod "ab077a9a-134b-497a-abce-777fb1303160" (UID: "ab077a9a-134b-497a-abce-777fb1303160"). 
InnerVolumeSpecName "kube-api-access-7pc4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.737457 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ab077a9a-134b-497a-abce-777fb1303160" (UID: "ab077a9a-134b-497a-abce-777fb1303160"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.740541 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ab077a9a-134b-497a-abce-777fb1303160" (UID: "ab077a9a-134b-497a-abce-777fb1303160"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.742988 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-inventory" (OuterVolumeSpecName: "inventory") pod "ab077a9a-134b-497a-abce-777fb1303160" (UID: "ab077a9a-134b-497a-abce-777fb1303160"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.751521 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ab077a9a-134b-497a-abce-777fb1303160" (UID: "ab077a9a-134b-497a-abce-777fb1303160"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.763253 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ab077a9a-134b-497a-abce-777fb1303160" (UID: "ab077a9a-134b-497a-abce-777fb1303160"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.799752 4892 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.799813 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.799844 4892 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.799866 4892 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-inventory\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.799884 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pc4p\" (UniqueName: \"kubernetes.io/projected/ab077a9a-134b-497a-abce-777fb1303160-kube-api-access-7pc4p\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.799903 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:41 crc kubenswrapper[4892]: I1006 12:55:41.799922 4892 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ab077a9a-134b-497a-abce-777fb1303160-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 06 12:55:42 crc kubenswrapper[4892]: I1006 12:55:42.107012 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" event={"ID":"ab077a9a-134b-497a-abce-777fb1303160","Type":"ContainerDied","Data":"276f05e2b16a9985e4c9f8259636b62b4db215819beecb7b11d8b92c12cc9d82"} Oct 06 12:55:42 crc kubenswrapper[4892]: I1006 12:55:42.107746 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="276f05e2b16a9985e4c9f8259636b62b4db215819beecb7b11d8b92c12cc9d82" Oct 06 12:55:42 crc kubenswrapper[4892]: I1006 12:55:42.107130 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.511803 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bdxj7"] Oct 06 12:56:14 crc kubenswrapper[4892]: E1006 12:56:14.512946 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="registry-server" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.512968 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="registry-server" Oct 06 12:56:14 crc kubenswrapper[4892]: E1006 12:56:14.513010 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="extract-content" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.513023 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="extract-content" Oct 06 12:56:14 crc kubenswrapper[4892]: E1006 12:56:14.513057 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab077a9a-134b-497a-abce-777fb1303160" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.513073 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab077a9a-134b-497a-abce-777fb1303160" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 12:56:14 crc kubenswrapper[4892]: E1006 12:56:14.513092 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="extract-utilities" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.513104 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="extract-utilities" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.513507 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d393d08-49bb-4fea-ab58-1c905d1c5b6b" containerName="registry-server" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.513532 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab077a9a-134b-497a-abce-777fb1303160" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.516006 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.526390 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdxj7"] Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.636419 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-catalog-content\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.636964 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xvmg\" (UniqueName: \"kubernetes.io/projected/1b2d464c-25a8-48fb-b695-45486b4257cf-kube-api-access-5xvmg\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.637139 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-utilities\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.739970 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-catalog-content\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.740128 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xvmg\" (UniqueName: \"kubernetes.io/projected/1b2d464c-25a8-48fb-b695-45486b4257cf-kube-api-access-5xvmg\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.740177 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-utilities\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.740967 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-catalog-content\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.741032 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-utilities\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.780463 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5xvmg\" (UniqueName: \"kubernetes.io/projected/1b2d464c-25a8-48fb-b695-45486b4257cf-kube-api-access-5xvmg\") pod \"redhat-marketplace-bdxj7\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:14 crc kubenswrapper[4892]: I1006 12:56:14.852645 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:15 crc kubenswrapper[4892]: I1006 12:56:15.361984 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdxj7"] Oct 06 12:56:15 crc kubenswrapper[4892]: I1006 12:56:15.497194 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdxj7" event={"ID":"1b2d464c-25a8-48fb-b695-45486b4257cf","Type":"ContainerStarted","Data":"2f60304f6d4b20dc75a9bc51f9a326604b9e4fc6c007b21ae9a4bbc3ba8741a8"} Oct 06 12:56:15 crc kubenswrapper[4892]: E1006 12:56:15.821305 4892 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b2d464c_25a8_48fb_b695_45486b4257cf.slice/crio-conmon-0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4.scope\": RecentStats: unable to find data in memory cache]" Oct 06 12:56:16 crc kubenswrapper[4892]: I1006 12:56:16.509298 4892 generic.go:334] "Generic (PLEG): container finished" podID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerID="0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4" exitCode=0 Oct 06 12:56:16 crc kubenswrapper[4892]: I1006 12:56:16.509538 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdxj7" event={"ID":"1b2d464c-25a8-48fb-b695-45486b4257cf","Type":"ContainerDied","Data":"0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4"} Oct 06 12:56:17 crc kubenswrapper[4892]: I1006 12:56:17.519983 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdxj7" event={"ID":"1b2d464c-25a8-48fb-b695-45486b4257cf","Type":"ContainerStarted","Data":"b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b"} Oct 06 12:56:18 crc kubenswrapper[4892]: I1006 12:56:18.529641 4892 generic.go:334] "Generic (PLEG): container finished" podID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerID="b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b" exitCode=0 Oct 06 12:56:18 crc kubenswrapper[4892]: I1006 12:56:18.529970 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdxj7" event={"ID":"1b2d464c-25a8-48fb-b695-45486b4257cf","Type":"ContainerDied","Data":"b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b"} Oct 06 12:56:18 crc kubenswrapper[4892]: I1006 12:56:18.847014 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:56:18 crc kubenswrapper[4892]: I1006 12:56:18.847666 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="prometheus" containerID="cri-o://0cee2e277f10a258841ce91b152eb5485f29229b28557888fb3b6f98b8b3e42e" gracePeriod=600 Oct 06 12:56:18 crc kubenswrapper[4892]: I1006 12:56:18.847847 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="config-reloader" containerID="cri-o://9bf96ce1640bcc275470f6e5513f22f48c5d3c0e0f3ca083a65b56c0dc15231d" gracePeriod=600 Oct 06 12:56:18 crc kubenswrapper[4892]: I1006 12:56:18.847908 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="thanos-sidecar" containerID="cri-o://6174bee3a7c656575e63cd034f8f5f86c38d4548b4770b38f3d52c08f17f05be" gracePeriod=600 Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.545140 4892 generic.go:334] "Generic (PLEG): container finished" podID="10380cce-a552-488a-8157-ea8425662776" containerID="6174bee3a7c656575e63cd034f8f5f86c38d4548b4770b38f3d52c08f17f05be" exitCode=0 Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.545454 4892 generic.go:334] "Generic (PLEG): container finished" podID="10380cce-a552-488a-8157-ea8425662776" containerID="9bf96ce1640bcc275470f6e5513f22f48c5d3c0e0f3ca083a65b56c0dc15231d" exitCode=0 Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.545463 4892 generic.go:334] "Generic (PLEG): container finished" podID="10380cce-a552-488a-8157-ea8425662776" containerID="0cee2e277f10a258841ce91b152eb5485f29229b28557888fb3b6f98b8b3e42e" exitCode=0 Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.545213 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerDied","Data":"6174bee3a7c656575e63cd034f8f5f86c38d4548b4770b38f3d52c08f17f05be"} Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.545518 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerDied","Data":"9bf96ce1640bcc275470f6e5513f22f48c5d3c0e0f3ca083a65b56c0dc15231d"} Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.545530 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerDied","Data":"0cee2e277f10a258841ce91b152eb5485f29229b28557888fb3b6f98b8b3e42e"} Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.547728 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdxj7" event={"ID":"1b2d464c-25a8-48fb-b695-45486b4257cf","Type":"ContainerStarted","Data":"0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a"} Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.571516 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bdxj7" podStartSLOduration=3.122337647 podStartE2EDuration="5.571490708s" podCreationTimestamp="2025-10-06 12:56:14 +0000 UTC" firstStartedPulling="2025-10-06 12:56:16.513064719 +0000 UTC m=+2863.062770514" lastFinishedPulling="2025-10-06 12:56:18.96221781 +0000 UTC m=+2865.511923575" observedRunningTime="2025-10-06 12:56:19.565047263 +0000 UTC m=+2866.114753028" watchObservedRunningTime="2025-10-06 12:56:19.571490708 +0000 UTC m=+2866.121196473" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.884849 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950185 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950259 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-thanos-prometheus-http-client-file\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950387 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950448 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950485 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9bjb\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-kube-api-access-q9bjb\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950567 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-secret-combined-ca-bundle\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950602 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10380cce-a552-488a-8157-ea8425662776-prometheus-metric-storage-rulefiles-0\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950635 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10380cce-a552-488a-8157-ea8425662776-config-out\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950664 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-config\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: 
\"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950692 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.950734 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-tls-assets\") pod \"10380cce-a552-488a-8157-ea8425662776\" (UID: \"10380cce-a552-488a-8157-ea8425662776\") " Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.952127 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10380cce-a552-488a-8157-ea8425662776-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.959478 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10380cce-a552-488a-8157-ea8425662776-config-out" (OuterVolumeSpecName: "config-out") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.961375 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.962827 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.962880 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.962970 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.963519 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.969920 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-kube-api-access-q9bjb" (OuterVolumeSpecName: "kube-api-access-q9bjb") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "kube-api-access-q9bjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.974246 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-config" (OuterVolumeSpecName: "config") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:56:19 crc kubenswrapper[4892]: I1006 12:56:19.987258 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.053977 4892 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054026 4892 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/10380cce-a552-488a-8157-ea8425662776-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054038 4892 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/10380cce-a552-488a-8157-ea8425662776-config-out\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054048 4892 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054056 4892 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054084 4892 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") on node \"crc\" " Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054095 4892 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054105 4892 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054118 4892 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.054128 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9bjb\" (UniqueName: \"kubernetes.io/projected/10380cce-a552-488a-8157-ea8425662776-kube-api-access-q9bjb\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.071463 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config" (OuterVolumeSpecName: "web-config") pod "10380cce-a552-488a-8157-ea8425662776" (UID: "10380cce-a552-488a-8157-ea8425662776"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.083873 4892 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.085125 4892 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa") on node "crc" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.156677 4892 reconciler_common.go:293] "Volume detached for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.156717 4892 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/10380cce-a552-488a-8157-ea8425662776-web-config\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.558664 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"10380cce-a552-488a-8157-ea8425662776","Type":"ContainerDied","Data":"1d082586dc4ded12b3a47603e3687cdca460014ec3a897ccea43b6bebab2141a"} Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.558702 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.558727 4892 scope.go:117] "RemoveContainer" containerID="6174bee3a7c656575e63cd034f8f5f86c38d4548b4770b38f3d52c08f17f05be" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.596481 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.602361 4892 scope.go:117] "RemoveContainer" containerID="9bf96ce1640bcc275470f6e5513f22f48c5d3c0e0f3ca083a65b56c0dc15231d" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.612342 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.624622 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:56:20 crc kubenswrapper[4892]: E1006 12:56:20.625046 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="prometheus" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.625068 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="prometheus" Oct 06 12:56:20 crc kubenswrapper[4892]: E1006 12:56:20.625097 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="init-config-reloader" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.625104 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="init-config-reloader" Oct 06 12:56:20 crc kubenswrapper[4892]: E1006 12:56:20.625119 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="thanos-sidecar" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.625126 4892 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="thanos-sidecar" Oct 06 12:56:20 crc kubenswrapper[4892]: E1006 12:56:20.625138 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="config-reloader" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.625144 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="config-reloader" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.625422 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="thanos-sidecar" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.625450 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="config-reloader" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.625469 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="10380cce-a552-488a-8157-ea8425662776" containerName="prometheus" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.632837 4892 scope.go:117] "RemoveContainer" containerID="0cee2e277f10a258841ce91b152eb5485f29229b28557888fb3b6f98b8b3e42e" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.639422 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.639541 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.642742 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-x7smc" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.642872 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.642928 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.654283 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.654583 4892 scope.go:117] "RemoveContainer" containerID="de88c8c1c78db93f82cbf007c5d171de907a99ee8d91ee70708b4b1725246986" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.665142 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.671521 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768240 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768302 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768338 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768364 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f636c8ba-cc7f-420c-8847-ad1ecf766974-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768385 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768454 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5b5n\" (UniqueName: \"kubernetes.io/projected/f636c8ba-cc7f-420c-8847-ad1ecf766974-kube-api-access-h5b5n\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768507 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f636c8ba-cc7f-420c-8847-ad1ecf766974-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768583 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768794 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f636c8ba-cc7f-420c-8847-ad1ecf766974-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.768942 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.769172 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-config\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.871821 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.871895 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.871936 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f636c8ba-cc7f-420c-8847-ad1ecf766974-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.871975 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.872009 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5b5n\" (UniqueName: \"kubernetes.io/projected/f636c8ba-cc7f-420c-8847-ad1ecf766974-kube-api-access-h5b5n\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.872042 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f636c8ba-cc7f-420c-8847-ad1ecf766974-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.872102 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 
06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.872176 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f636c8ba-cc7f-420c-8847-ad1ecf766974-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.872243 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.872431 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-config\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.872504 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.879701 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.880127 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.880533 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f636c8ba-cc7f-420c-8847-ad1ecf766974-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.885637 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.886219 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.889436 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f636c8ba-cc7f-420c-8847-ad1ecf766974-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.890767 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f636c8ba-cc7f-420c-8847-ad1ecf766974-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.897257 4892 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.897364 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b0f201420d0adafcb475a965fbfd99b4a272413cc10e31ea76ae8257a696a4f5/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.898106 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.904252 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f636c8ba-cc7f-420c-8847-ad1ecf766974-config\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.914830 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5b5n\" (UniqueName: \"kubernetes.io/projected/f636c8ba-cc7f-420c-8847-ad1ecf766974-kube-api-access-h5b5n\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:20 crc kubenswrapper[4892]: I1006 12:56:20.990611 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1e5af42b-8548-4ee1-9969-6695b8ebb9fa\") pod \"prometheus-metric-storage-0\" (UID: \"f636c8ba-cc7f-420c-8847-ad1ecf766974\") " pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:21 crc kubenswrapper[4892]: I1006 12:56:21.267773 4892 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:21 crc kubenswrapper[4892]: I1006 12:56:21.779394 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 06 12:56:22 crc kubenswrapper[4892]: I1006 12:56:22.187199 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10380cce-a552-488a-8157-ea8425662776" path="/var/lib/kubelet/pods/10380cce-a552-488a-8157-ea8425662776/volumes" Oct 06 12:56:22 crc kubenswrapper[4892]: I1006 12:56:22.583365 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f636c8ba-cc7f-420c-8847-ad1ecf766974","Type":"ContainerStarted","Data":"b14dbed2d50118c123f8c111ce16388e684aabbf1857e1ae3f33777846100ded"} Oct 06 12:56:22 crc kubenswrapper[4892]: I1006 12:56:22.985013 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:56:22 crc kubenswrapper[4892]: I1006 12:56:22.985069 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:56:24 crc kubenswrapper[4892]: I1006 12:56:24.852761 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:24 crc kubenswrapper[4892]: I1006 12:56:24.853081 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:25 crc kubenswrapper[4892]: I1006 12:56:25.137236 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:25 crc kubenswrapper[4892]: I1006 12:56:25.734855 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:26 crc kubenswrapper[4892]: I1006 12:56:26.649852 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f636c8ba-cc7f-420c-8847-ad1ecf766974","Type":"ContainerStarted","Data":"6e338d51ad18053bf3e31adaed63506fa2257f6259fb221907b2bc729021bda9"} Oct 06 12:56:27 crc kubenswrapper[4892]: I1006 12:56:27.093643 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdxj7"] Oct 06 12:56:27 crc kubenswrapper[4892]: I1006 12:56:27.663741 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bdxj7" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="registry-server" containerID="cri-o://0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a" gracePeriod=2 Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.186368 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.226647 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-catalog-content\") pod \"1b2d464c-25a8-48fb-b695-45486b4257cf\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.226775 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xvmg\" (UniqueName: \"kubernetes.io/projected/1b2d464c-25a8-48fb-b695-45486b4257cf-kube-api-access-5xvmg\") pod \"1b2d464c-25a8-48fb-b695-45486b4257cf\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.226811 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-utilities\") pod \"1b2d464c-25a8-48fb-b695-45486b4257cf\" (UID: \"1b2d464c-25a8-48fb-b695-45486b4257cf\") " Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.228028 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-utilities" (OuterVolumeSpecName: "utilities") pod "1b2d464c-25a8-48fb-b695-45486b4257cf" (UID: "1b2d464c-25a8-48fb-b695-45486b4257cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.230284 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.233021 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2d464c-25a8-48fb-b695-45486b4257cf-kube-api-access-5xvmg" (OuterVolumeSpecName: "kube-api-access-5xvmg") pod "1b2d464c-25a8-48fb-b695-45486b4257cf" (UID: "1b2d464c-25a8-48fb-b695-45486b4257cf"). InnerVolumeSpecName "kube-api-access-5xvmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.242128 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b2d464c-25a8-48fb-b695-45486b4257cf" (UID: "1b2d464c-25a8-48fb-b695-45486b4257cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.333171 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b2d464c-25a8-48fb-b695-45486b4257cf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.333228 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xvmg\" (UniqueName: \"kubernetes.io/projected/1b2d464c-25a8-48fb-b695-45486b4257cf-kube-api-access-5xvmg\") on node \"crc\" DevicePath \"\"" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.690820 4892 generic.go:334] "Generic (PLEG): container finished" podID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerID="0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a" exitCode=0 Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.690913 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdxj7" event={"ID":"1b2d464c-25a8-48fb-b695-45486b4257cf","Type":"ContainerDied","Data":"0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a"} Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.690996 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bdxj7" event={"ID":"1b2d464c-25a8-48fb-b695-45486b4257cf","Type":"ContainerDied","Data":"2f60304f6d4b20dc75a9bc51f9a326604b9e4fc6c007b21ae9a4bbc3ba8741a8"} Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.690998 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bdxj7" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.691027 4892 scope.go:117] "RemoveContainer" containerID="0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.720078 4892 scope.go:117] "RemoveContainer" containerID="b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.772438 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdxj7"] Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.772607 4892 scope.go:117] "RemoveContainer" containerID="0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.784788 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bdxj7"] Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.817402 4892 scope.go:117] "RemoveContainer" containerID="0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a" Oct 06 12:56:28 crc kubenswrapper[4892]: E1006 12:56:28.817851 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a\": container with ID starting with 0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a not found: ID does not exist" containerID="0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.817891 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a"} err="failed to get container status 
\"0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a\": rpc error: code = NotFound desc = could not find container \"0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a\": container with ID starting with 0e843f38d46e982a49246a66888e6238cbf5922ff00e6b9c9c83f34248fbc47a not found: ID does not exist" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.817916 4892 scope.go:117] "RemoveContainer" containerID="b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b" Oct 06 12:56:28 crc kubenswrapper[4892]: E1006 12:56:28.818205 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b\": container with ID starting with b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b not found: ID does not exist" containerID="b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.818255 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b"} err="failed to get container status \"b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b\": rpc error: code = NotFound desc = could not find container \"b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b\": container with ID starting with b75b0d9dc1941f3bf121bed9a196511147fc27de53404b348ec96d72c21bf12b not found: ID does not exist" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.818288 4892 scope.go:117] "RemoveContainer" containerID="0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4" Oct 06 12:56:28 crc kubenswrapper[4892]: E1006 12:56:28.818857 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4\": container with ID starting with 0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4 not found: ID does not exist" containerID="0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4" Oct 06 12:56:28 crc kubenswrapper[4892]: I1006 12:56:28.818910 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4"} err="failed to get container status \"0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4\": rpc error: code = NotFound desc = could not find container \"0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4\": container with ID starting with 0d12d852c7ba04b5445710431e4cca181867f099ca62a2b3dca2dc1cb0f052b4 not found: ID does not exist" Oct 06 12:56:30 crc kubenswrapper[4892]: I1006 12:56:30.193299 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" path="/var/lib/kubelet/pods/1b2d464c-25a8-48fb-b695-45486b4257cf/volumes" Oct 06 12:56:36 crc kubenswrapper[4892]: I1006 12:56:36.800407 4892 generic.go:334] "Generic (PLEG): container finished" podID="f636c8ba-cc7f-420c-8847-ad1ecf766974" containerID="6e338d51ad18053bf3e31adaed63506fa2257f6259fb221907b2bc729021bda9" exitCode=0 Oct 06 12:56:36 crc kubenswrapper[4892]: I1006 12:56:36.800621 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"f636c8ba-cc7f-420c-8847-ad1ecf766974","Type":"ContainerDied","Data":"6e338d51ad18053bf3e31adaed63506fa2257f6259fb221907b2bc729021bda9"} Oct 06 12:56:37 crc kubenswrapper[4892]: I1006 12:56:37.813477 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f636c8ba-cc7f-420c-8847-ad1ecf766974","Type":"ContainerStarted","Data":"8e2d8ba6bce82121da4b708450e4a2a5e0c90769cde65a72f2f1b5045ca5b8e5"} Oct 06 12:56:41 crc kubenswrapper[4892]: I1006 12:56:41.865183 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f636c8ba-cc7f-420c-8847-ad1ecf766974","Type":"ContainerStarted","Data":"8f26d7273d610f56854745568d329ebd6cbf767f90374811aa055705351a04d8"} Oct 06 12:56:41 crc kubenswrapper[4892]: I1006 12:56:41.865655 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f636c8ba-cc7f-420c-8847-ad1ecf766974","Type":"ContainerStarted","Data":"3f298600a87caf56074e3d17afae0c2c9757ec11f2399ab709c16c8fb6b61bd9"} Oct 06 12:56:41 crc kubenswrapper[4892]: I1006 12:56:41.912845 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.912824379 podStartE2EDuration="21.912824379s" podCreationTimestamp="2025-10-06 12:56:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 12:56:41.902555144 +0000 UTC m=+2888.452260969" watchObservedRunningTime="2025-10-06 12:56:41.912824379 +0000 UTC m=+2888.462530154" Oct 06 12:56:46 crc kubenswrapper[4892]: I1006 12:56:46.268805 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:51 crc kubenswrapper[4892]: I1006 12:56:51.269689 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:51 crc kubenswrapper[4892]: I1006 12:56:51.286135 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:51 crc kubenswrapper[4892]: I1006 12:56:51.977406 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 06 12:56:52 crc kubenswrapper[4892]: I1006 12:56:52.984424 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:56:52 crc kubenswrapper[4892]: I1006 12:56:52.984499 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.270906 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 12:57:14 crc kubenswrapper[4892]: E1006 12:57:14.271971 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="registry-server" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.271988 
4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="registry-server" Oct 06 12:57:14 crc kubenswrapper[4892]: E1006 12:57:14.272018 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="extract-content" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.272027 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="extract-content" Oct 06 12:57:14 crc kubenswrapper[4892]: E1006 12:57:14.272061 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="extract-utilities" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.272070 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="extract-utilities" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.272313 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2d464c-25a8-48fb-b695-45486b4257cf" containerName="registry-server" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.273346 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.275486 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.276581 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.276681 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zcjzz" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.277289 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.284217 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.397693 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.397750 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.397954 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.398006 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.398083 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2443f61e-fb23-4eeb-9e36-7ee51d31b322-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.398171 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz5nc\" (UniqueName: \"kubernetes.io/projected/2443f61e-fb23-4eeb-9e36-7ee51d31b322-kube-api-access-bz5nc\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.398271 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2443f61e-fb23-4eeb-9e36-7ee51d31b322-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.398366 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2443f61e-fb23-4eeb-9e36-7ee51d31b322-config-data\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.398470 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2443f61e-fb23-4eeb-9e36-7ee51d31b322-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500197 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz5nc\" (UniqueName: \"kubernetes.io/projected/2443f61e-fb23-4eeb-9e36-7ee51d31b322-kube-api-access-bz5nc\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500267 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2443f61e-fb23-4eeb-9e36-7ee51d31b322-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500309 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2443f61e-fb23-4eeb-9e36-7ee51d31b322-config-data\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500362 4892 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2443f61e-fb23-4eeb-9e36-7ee51d31b322-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500410 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500438 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500482 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500506 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500522 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/2443f61e-fb23-4eeb-9e36-7ee51d31b322-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500765 4892 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500846 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/2443f61e-fb23-4eeb-9e36-7ee51d31b322-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.500895 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/2443f61e-fb23-4eeb-9e36-7ee51d31b322-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.501412 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/2443f61e-fb23-4eeb-9e36-7ee51d31b322-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.501696 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2443f61e-fb23-4eeb-9e36-7ee51d31b322-config-data\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.507297 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.507783 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.507814 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/2443f61e-fb23-4eeb-9e36-7ee51d31b322-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.515448 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz5nc\" (UniqueName: \"kubernetes.io/projected/2443f61e-fb23-4eeb-9e36-7ee51d31b322-kube-api-access-bz5nc\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.552547 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"2443f61e-fb23-4eeb-9e36-7ee51d31b322\") " pod="openstack/tempest-tests-tempest" Oct 06 12:57:14 crc kubenswrapper[4892]: I1006 12:57:14.601193 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 06 12:57:15 crc kubenswrapper[4892]: I1006 12:57:15.104835 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 06 12:57:15 crc kubenswrapper[4892]: I1006 12:57:15.274418 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"2443f61e-fb23-4eeb-9e36-7ee51d31b322","Type":"ContainerStarted","Data":"671536894c0792c0fc8a9146484ae1b29882123bce116112dc5e83f95170d087"} Oct 06 12:57:22 crc kubenswrapper[4892]: I1006 12:57:22.984123 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 12:57:22 crc kubenswrapper[4892]: I1006 12:57:22.984888 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 12:57:22 crc kubenswrapper[4892]: I1006 12:57:22.984944 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 12:57:22 crc kubenswrapper[4892]: I1006 12:57:22.985813 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d194753ef4190e3702b4f0846aa3401fc1db7562b22faa07ffe712a7046afbbf"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 12:57:22 crc kubenswrapper[4892]: I1006 12:57:22.985878 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://d194753ef4190e3702b4f0846aa3401fc1db7562b22faa07ffe712a7046afbbf" gracePeriod=600 Oct 06 12:57:23 crc kubenswrapper[4892]: I1006 12:57:23.403531 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="d194753ef4190e3702b4f0846aa3401fc1db7562b22faa07ffe712a7046afbbf" exitCode=0 Oct 06 12:57:23 crc kubenswrapper[4892]: I1006 12:57:23.403589 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"d194753ef4190e3702b4f0846aa3401fc1db7562b22faa07ffe712a7046afbbf"} Oct 06 12:57:23 crc kubenswrapper[4892]: I1006 12:57:23.403625 4892 scope.go:117] "RemoveContainer" containerID="1287309db37c098cd7f362bfc0f4fd7cc946024144da7aed08b50801cbbd44ec" Oct 06 12:57:26 crc kubenswrapper[4892]: I1006 12:57:26.450835 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"} Oct 06 12:57:26 crc kubenswrapper[4892]: I1006 12:57:26.453216 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"2443f61e-fb23-4eeb-9e36-7ee51d31b322","Type":"ContainerStarted","Data":"cb1497f422a679fd7cb4cc3cc1bbcade9716517ee6e59fdea3bfa1b04bb4d3eb"} Oct 06 12:57:26 crc kubenswrapper[4892]: I1006 12:57:26.510446 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.525729359 podStartE2EDuration="13.510413911s" podCreationTimestamp="2025-10-06 12:57:13 +0000 UTC" firstStartedPulling="2025-10-06 12:57:15.109066106 +0000 UTC m=+2921.658771881" lastFinishedPulling="2025-10-06 12:57:25.093750628 +0000 UTC m=+2931.643456433" observedRunningTime="2025-10-06 12:57:26.497771748 +0000 UTC m=+2933.047477523" watchObservedRunningTime="2025-10-06 12:57:26.510413911 +0000 UTC m=+2933.060119666" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.664992 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qbntz"] Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.672559 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.683300 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbntz"] Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.793803 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-utilities\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.793888 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvwwq\" (UniqueName: \"kubernetes.io/projected/e0ce7a9a-f99c-4421-b226-9402e1b415d4-kube-api-access-dvwwq\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.794986 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-catalog-content\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.898033 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-catalog-content\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.898638 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-utilities\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.898672 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvwwq\" 
(UniqueName: \"kubernetes.io/projected/e0ce7a9a-f99c-4421-b226-9402e1b415d4-kube-api-access-dvwwq\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.899077 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-catalog-content\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.899209 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-utilities\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:22 crc kubenswrapper[4892]: I1006 12:59:22.927342 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvwwq\" (UniqueName: \"kubernetes.io/projected/e0ce7a9a-f99c-4421-b226-9402e1b415d4-kube-api-access-dvwwq\") pod \"community-operators-qbntz\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") " pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:23 crc kubenswrapper[4892]: I1006 12:59:23.018847 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:23 crc kubenswrapper[4892]: I1006 12:59:23.493721 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qbntz"] Oct 06 12:59:23 crc kubenswrapper[4892]: I1006 12:59:23.849975 4892 generic.go:334] "Generic (PLEG): container finished" podID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerID="a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2" exitCode=0 Oct 06 12:59:23 crc kubenswrapper[4892]: I1006 12:59:23.850049 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbntz" event={"ID":"e0ce7a9a-f99c-4421-b226-9402e1b415d4","Type":"ContainerDied","Data":"a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2"} Oct 06 12:59:23 crc kubenswrapper[4892]: I1006 12:59:23.850460 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbntz" event={"ID":"e0ce7a9a-f99c-4421-b226-9402e1b415d4","Type":"ContainerStarted","Data":"ab55f8a627f41149fef391612a2bdc8a6c86390c791a4bc91273031847fc647d"} Oct 06 12:59:25 crc kubenswrapper[4892]: I1006 12:59:25.878191 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbntz" event={"ID":"e0ce7a9a-f99c-4421-b226-9402e1b415d4","Type":"ContainerStarted","Data":"e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a"} Oct 06 12:59:26 crc kubenswrapper[4892]: I1006 12:59:26.898547 4892 generic.go:334] "Generic (PLEG): container finished" podID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerID="e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a" exitCode=0 Oct 06 12:59:26 crc kubenswrapper[4892]: I1006 12:59:26.898675 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbntz" 
event={"ID":"e0ce7a9a-f99c-4421-b226-9402e1b415d4","Type":"ContainerDied","Data":"e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a"} Oct 06 12:59:27 crc kubenswrapper[4892]: I1006 12:59:27.911708 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbntz" event={"ID":"e0ce7a9a-f99c-4421-b226-9402e1b415d4","Type":"ContainerStarted","Data":"93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e"} Oct 06 12:59:27 crc kubenswrapper[4892]: I1006 12:59:27.937408 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qbntz" podStartSLOduration=2.343337149 podStartE2EDuration="5.937388388s" podCreationTimestamp="2025-10-06 12:59:22 +0000 UTC" firstStartedPulling="2025-10-06 12:59:23.851918739 +0000 UTC m=+3050.401624504" lastFinishedPulling="2025-10-06 12:59:27.445969968 +0000 UTC m=+3053.995675743" observedRunningTime="2025-10-06 12:59:27.929607224 +0000 UTC m=+3054.479313029" watchObservedRunningTime="2025-10-06 12:59:27.937388388 +0000 UTC m=+3054.487094153" Oct 06 12:59:33 crc kubenswrapper[4892]: I1006 12:59:33.019953 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:33 crc kubenswrapper[4892]: I1006 12:59:33.020647 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:33 crc kubenswrapper[4892]: I1006 12:59:33.079863 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:34 crc kubenswrapper[4892]: I1006 12:59:34.055572 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qbntz" Oct 06 12:59:34 crc kubenswrapper[4892]: I1006 12:59:34.125100 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbntz"] Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.019981 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qbntz" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="registry-server" containerID="cri-o://93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e" gracePeriod=2 Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.627696 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qbntz"
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.721686 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvwwq\" (UniqueName: \"kubernetes.io/projected/e0ce7a9a-f99c-4421-b226-9402e1b415d4-kube-api-access-dvwwq\") pod \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") "
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.722249 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-utilities\") pod \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") "
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.722295 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-catalog-content\") pod \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\" (UID: \"e0ce7a9a-f99c-4421-b226-9402e1b415d4\") "
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.723912 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-utilities" (OuterVolumeSpecName: "utilities") pod "e0ce7a9a-f99c-4421-b226-9402e1b415d4" (UID: "e0ce7a9a-f99c-4421-b226-9402e1b415d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.732029 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ce7a9a-f99c-4421-b226-9402e1b415d4-kube-api-access-dvwwq" (OuterVolumeSpecName: "kube-api-access-dvwwq") pod "e0ce7a9a-f99c-4421-b226-9402e1b415d4" (UID: "e0ce7a9a-f99c-4421-b226-9402e1b415d4"). InnerVolumeSpecName "kube-api-access-dvwwq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.803230 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0ce7a9a-f99c-4421-b226-9402e1b415d4" (UID: "e0ce7a9a-f99c-4421-b226-9402e1b415d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.825821 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvwwq\" (UniqueName: \"kubernetes.io/projected/e0ce7a9a-f99c-4421-b226-9402e1b415d4-kube-api-access-dvwwq\") on node \"crc\" DevicePath \"\""
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.825876 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 12:59:36 crc kubenswrapper[4892]: I1006 12:59:36.825901 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ce7a9a-f99c-4421-b226-9402e1b415d4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.033119 4892 generic.go:334] "Generic (PLEG): container finished" podID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerID="93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e" exitCode=0
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.033169 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbntz" event={"ID":"e0ce7a9a-f99c-4421-b226-9402e1b415d4","Type":"ContainerDied","Data":"93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e"}
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.033220 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qbntz" event={"ID":"e0ce7a9a-f99c-4421-b226-9402e1b415d4","Type":"ContainerDied","Data":"ab55f8a627f41149fef391612a2bdc8a6c86390c791a4bc91273031847fc647d"}
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.033244 4892 scope.go:117] "RemoveContainer" containerID="93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.033257 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qbntz"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.064843 4892 scope.go:117] "RemoveContainer" containerID="e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.083250 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qbntz"]
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.097708 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qbntz"]
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.099977 4892 scope.go:117] "RemoveContainer" containerID="a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.165201 4892 scope.go:117] "RemoveContainer" containerID="93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e"
Oct 06 12:59:37 crc kubenswrapper[4892]: E1006 12:59:37.168065 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e\": container with ID starting with 93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e not found: ID does not exist" containerID="93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.168111 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e"} err="failed to get container status \"93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e\": rpc error: code = NotFound desc = could not find container \"93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e\": container with ID starting with 93d9f91aa78258511dcdee7252f77f792737eabb1d801c3e1796b4b4a7ca860e not found: ID does not exist"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.168138 4892 scope.go:117] "RemoveContainer" containerID="e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a"
Oct 06 12:59:37 crc kubenswrapper[4892]: E1006 12:59:37.173479 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a\": container with ID starting with e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a not found: ID does not exist" containerID="e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.173538 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a"} err="failed to get container status \"e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a\": rpc error: code = NotFound desc = could not find container \"e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a\": container with ID starting with e0eb710b92d711c91952186b6390dddadfa473c952708cfb28d26c6f7a1a605a not found: ID does not exist"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.173578 4892 scope.go:117] "RemoveContainer" containerID="a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2"
Oct 06 12:59:37 crc kubenswrapper[4892]: E1006 12:59:37.174808 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2\": container with ID starting with a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2 not found: ID does not exist" containerID="a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2"
Oct 06 12:59:37 crc kubenswrapper[4892]: I1006 12:59:37.174855 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2"} err="failed to get container status \"a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2\": rpc error: code = NotFound desc = could not find container \"a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2\": container with ID starting with a23f61a0bbf53c8ad4b8b06923ab637fc6353c410f9f43bb825d5a96ec504db2 not found: ID does not exist"
Oct 06 12:59:38 crc kubenswrapper[4892]: I1006 12:59:38.181820 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" path="/var/lib/kubelet/pods/e0ce7a9a-f99c-4421-b226-9402e1b415d4/volumes"
Oct 06 12:59:52 crc kubenswrapper[4892]: I1006 12:59:52.985076 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 12:59:52 crc kubenswrapper[4892]: I1006 12:59:52.985958 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.186192 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"]
Oct 06 13:00:00 crc kubenswrapper[4892]: E1006 13:00:00.187119 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="extract-content"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.187132 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="extract-content"
Oct 06 13:00:00 crc kubenswrapper[4892]: E1006 13:00:00.187153 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="extract-utilities"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.187159 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="extract-utilities"
Oct 06 13:00:00 crc kubenswrapper[4892]: E1006 13:00:00.187172 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="registry-server"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.187179 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="registry-server"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.187443 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ce7a9a-f99c-4421-b226-9402e1b415d4" containerName="registry-server"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.188111 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.190787 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.199131 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.204373 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"]
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.243159 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316d8af4-af87-4abf-b86a-3059a5f365ec-secret-volume\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.244282 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316d8af4-af87-4abf-b86a-3059a5f365ec-config-volume\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.244534 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg8z\" (UniqueName: \"kubernetes.io/projected/316d8af4-af87-4abf-b86a-3059a5f365ec-kube-api-access-fgg8z\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.346274 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316d8af4-af87-4abf-b86a-3059a5f365ec-secret-volume\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.346478 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316d8af4-af87-4abf-b86a-3059a5f365ec-config-volume\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.346539 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg8z\" (UniqueName: \"kubernetes.io/projected/316d8af4-af87-4abf-b86a-3059a5f365ec-kube-api-access-fgg8z\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.347620 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316d8af4-af87-4abf-b86a-3059a5f365ec-config-volume\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.353977 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316d8af4-af87-4abf-b86a-3059a5f365ec-secret-volume\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.371071 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg8z\" (UniqueName: \"kubernetes.io/projected/316d8af4-af87-4abf-b86a-3059a5f365ec-kube-api-access-fgg8z\") pod \"collect-profiles-29329260-jpcfq\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.509565 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:00 crc kubenswrapper[4892]: I1006 13:00:00.986724 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"]
Oct 06 13:00:01 crc kubenswrapper[4892]: I1006 13:00:01.329947 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq" event={"ID":"316d8af4-af87-4abf-b86a-3059a5f365ec","Type":"ContainerStarted","Data":"6000c519e8ad01f46518a4edf9bda39dd0b7a08be6add40af33ecf5e11f5d034"}
Oct 06 13:00:01 crc kubenswrapper[4892]: I1006 13:00:01.330279 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq" event={"ID":"316d8af4-af87-4abf-b86a-3059a5f365ec","Type":"ContainerStarted","Data":"086259dc1bebf0bb0f46f01e9d5e770136e417a1c7690ec0302a280e09862e65"}
Oct 06 13:00:01 crc kubenswrapper[4892]: I1006 13:00:01.349055 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq" podStartSLOduration=1.349035709 podStartE2EDuration="1.349035709s" podCreationTimestamp="2025-10-06 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:00:01.342959724 +0000 UTC m=+3087.892665489" watchObservedRunningTime="2025-10-06 13:00:01.349035709 +0000 UTC m=+3087.898741474"
Oct 06 13:00:02 crc kubenswrapper[4892]: I1006 13:00:02.341832 4892 generic.go:334] "Generic (PLEG): container finished" podID="316d8af4-af87-4abf-b86a-3059a5f365ec" containerID="6000c519e8ad01f46518a4edf9bda39dd0b7a08be6add40af33ecf5e11f5d034" exitCode=0
Oct 06 13:00:02 crc kubenswrapper[4892]: I1006 13:00:02.341890 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq" event={"ID":"316d8af4-af87-4abf-b86a-3059a5f365ec","Type":"ContainerDied","Data":"6000c519e8ad01f46518a4edf9bda39dd0b7a08be6add40af33ecf5e11f5d034"}
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.710194 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.808450 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316d8af4-af87-4abf-b86a-3059a5f365ec-secret-volume\") pod \"316d8af4-af87-4abf-b86a-3059a5f365ec\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") "
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.808513 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgg8z\" (UniqueName: \"kubernetes.io/projected/316d8af4-af87-4abf-b86a-3059a5f365ec-kube-api-access-fgg8z\") pod \"316d8af4-af87-4abf-b86a-3059a5f365ec\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") "
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.808577 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316d8af4-af87-4abf-b86a-3059a5f365ec-config-volume\") pod \"316d8af4-af87-4abf-b86a-3059a5f365ec\" (UID: \"316d8af4-af87-4abf-b86a-3059a5f365ec\") "
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.809443 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316d8af4-af87-4abf-b86a-3059a5f365ec-config-volume" (OuterVolumeSpecName: "config-volume") pod "316d8af4-af87-4abf-b86a-3059a5f365ec" (UID: "316d8af4-af87-4abf-b86a-3059a5f365ec"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.815310 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316d8af4-af87-4abf-b86a-3059a5f365ec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "316d8af4-af87-4abf-b86a-3059a5f365ec" (UID: "316d8af4-af87-4abf-b86a-3059a5f365ec"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.815487 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316d8af4-af87-4abf-b86a-3059a5f365ec-kube-api-access-fgg8z" (OuterVolumeSpecName: "kube-api-access-fgg8z") pod "316d8af4-af87-4abf-b86a-3059a5f365ec" (UID: "316d8af4-af87-4abf-b86a-3059a5f365ec"). InnerVolumeSpecName "kube-api-access-fgg8z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.911058 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316d8af4-af87-4abf-b86a-3059a5f365ec-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.911091 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgg8z\" (UniqueName: \"kubernetes.io/projected/316d8af4-af87-4abf-b86a-3059a5f365ec-kube-api-access-fgg8z\") on node \"crc\" DevicePath \"\""
Oct 06 13:00:03 crc kubenswrapper[4892]: I1006 13:00:03.911104 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316d8af4-af87-4abf-b86a-3059a5f365ec-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 13:00:04 crc kubenswrapper[4892]: I1006 13:00:04.371627 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq" event={"ID":"316d8af4-af87-4abf-b86a-3059a5f365ec","Type":"ContainerDied","Data":"086259dc1bebf0bb0f46f01e9d5e770136e417a1c7690ec0302a280e09862e65"}
Oct 06 13:00:04 crc kubenswrapper[4892]: I1006 13:00:04.371730 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="086259dc1bebf0bb0f46f01e9d5e770136e417a1c7690ec0302a280e09862e65"
Oct 06 13:00:04 crc kubenswrapper[4892]: I1006 13:00:04.372105 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"
Oct 06 13:00:04 crc kubenswrapper[4892]: I1006 13:00:04.429244 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62"]
Oct 06 13:00:04 crc kubenswrapper[4892]: I1006 13:00:04.439396 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329215-q2p62"]
Oct 06 13:00:06 crc kubenswrapper[4892]: I1006 13:00:06.183444 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e968722-61cc-49e2-a817-981c8a48b4de" path="/var/lib/kubelet/pods/4e968722-61cc-49e2-a817-981c8a48b4de/volumes"
Oct 06 13:00:22 crc kubenswrapper[4892]: I1006 13:00:22.984125 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:00:22 crc kubenswrapper[4892]: I1006 13:00:22.984736 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:00:52 crc kubenswrapper[4892]: I1006 13:00:52.984156 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 06 13:00:52 crc kubenswrapper[4892]: I1006 13:00:52.984778 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:00:52 crc kubenswrapper[4892]: I1006 13:00:52.984845 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s"
Oct 06 13:00:52 crc kubenswrapper[4892]: I1006 13:00:52.985942 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 13:00:52 crc kubenswrapper[4892]: I1006 13:00:52.986138 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" gracePeriod=600
Oct 06 13:00:53 crc kubenswrapper[4892]: E1006 13:00:53.110063 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:00:53 crc kubenswrapper[4892]: I1006 13:00:53.931463 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" exitCode=0
Oct 06 13:00:53 crc kubenswrapper[4892]: I1006 13:00:53.931594 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"}
Oct 06 13:00:53 crc kubenswrapper[4892]: I1006 13:00:53.932019 4892 scope.go:117] "RemoveContainer" containerID="d194753ef4190e3702b4f0846aa3401fc1db7562b22faa07ffe712a7046afbbf"
Oct 06 13:00:53 crc kubenswrapper[4892]: I1006 13:00:53.933044 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:00:53 crc kubenswrapper[4892]: E1006 13:00:53.933886 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.169236 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29329261-8fz25"]
Oct 06 13:01:00 crc kubenswrapper[4892]: E1006 13:01:00.170381 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316d8af4-af87-4abf-b86a-3059a5f365ec" containerName="collect-profiles"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.170397 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="316d8af4-af87-4abf-b86a-3059a5f365ec" containerName="collect-profiles"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.170868 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="316d8af4-af87-4abf-b86a-3059a5f365ec" containerName="collect-profiles"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.171792 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.194443 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329261-8fz25"]
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.373378 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-combined-ca-bundle\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.373523 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-config-data\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.373969 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbvz5\" (UniqueName: \"kubernetes.io/projected/5d32494b-495f-4f2b-bffd-e514f409a5fd-kube-api-access-sbvz5\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.374421 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-fernet-keys\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.476274 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-fernet-keys\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.476338 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-combined-ca-bundle\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.476368 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-config-data\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.476454 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbvz5\" (UniqueName: \"kubernetes.io/projected/5d32494b-495f-4f2b-bffd-e514f409a5fd-kube-api-access-sbvz5\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.485760 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-fernet-keys\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.485876 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-config-data\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.495674 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-combined-ca-bundle\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.498274 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbvz5\" (UniqueName: \"kubernetes.io/projected/5d32494b-495f-4f2b-bffd-e514f409a5fd-kube-api-access-sbvz5\") pod \"keystone-cron-29329261-8fz25\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") " pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.502113 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:00 crc kubenswrapper[4892]: I1006 13:01:00.968582 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329261-8fz25"]
Oct 06 13:01:00 crc kubenswrapper[4892]: W1006 13:01:00.987737 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d32494b_495f_4f2b_bffd_e514f409a5fd.slice/crio-9105e901dc951e7f28d16b0b80c3971458a9f3b4f9a4e8298cb675a67abacfc5 WatchSource:0}: Error finding container 9105e901dc951e7f28d16b0b80c3971458a9f3b4f9a4e8298cb675a67abacfc5: Status 404 returned error can't find the container with id 9105e901dc951e7f28d16b0b80c3971458a9f3b4f9a4e8298cb675a67abacfc5
Oct 06 13:01:01 crc kubenswrapper[4892]: I1006 13:01:01.047612 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329261-8fz25" event={"ID":"5d32494b-495f-4f2b-bffd-e514f409a5fd","Type":"ContainerStarted","Data":"9105e901dc951e7f28d16b0b80c3971458a9f3b4f9a4e8298cb675a67abacfc5"}
Oct 06 13:01:02 crc kubenswrapper[4892]: I1006 13:01:02.058227 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329261-8fz25" event={"ID":"5d32494b-495f-4f2b-bffd-e514f409a5fd","Type":"ContainerStarted","Data":"b4ea3b2798d17e20d683a60098d8b4bfaf0676dd167e66b9019d6f3d2aac9bee"}
Oct 06 13:01:02 crc kubenswrapper[4892]: I1006 13:01:02.078031 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29329261-8fz25" podStartSLOduration=2.078012107 podStartE2EDuration="2.078012107s" podCreationTimestamp="2025-10-06 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 13:01:02.074720343 +0000 UTC m=+3148.624426108" watchObservedRunningTime="2025-10-06 13:01:02.078012107 +0000 UTC m=+3148.627717882"
Oct 06 13:01:02 crc kubenswrapper[4892]: I1006 13:01:02.875922 4892 scope.go:117] "RemoveContainer" containerID="07b2d548f9b68e071ce29a00de8cafc935bcb05a12b3c95624fd3ffa8eb03f90"
Oct 06 13:01:04 crc kubenswrapper[4892]: I1006 13:01:04.077595 4892 generic.go:334] "Generic (PLEG): container finished" podID="5d32494b-495f-4f2b-bffd-e514f409a5fd" containerID="b4ea3b2798d17e20d683a60098d8b4bfaf0676dd167e66b9019d6f3d2aac9bee" exitCode=0
Oct 06 13:01:04 crc kubenswrapper[4892]: I1006 13:01:04.077687 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329261-8fz25" event={"ID":"5d32494b-495f-4f2b-bffd-e514f409a5fd","Type":"ContainerDied","Data":"b4ea3b2798d17e20d683a60098d8b4bfaf0676dd167e66b9019d6f3d2aac9bee"}
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.492185 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.593303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbvz5\" (UniqueName: \"kubernetes.io/projected/5d32494b-495f-4f2b-bffd-e514f409a5fd-kube-api-access-sbvz5\") pod \"5d32494b-495f-4f2b-bffd-e514f409a5fd\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") "
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.593931 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-config-data\") pod \"5d32494b-495f-4f2b-bffd-e514f409a5fd\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") "
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.594947 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-fernet-keys\") pod \"5d32494b-495f-4f2b-bffd-e514f409a5fd\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") "
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.595035 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-combined-ca-bundle\") pod \"5d32494b-495f-4f2b-bffd-e514f409a5fd\" (UID: \"5d32494b-495f-4f2b-bffd-e514f409a5fd\") "
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.600486 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d32494b-495f-4f2b-bffd-e514f409a5fd-kube-api-access-sbvz5" (OuterVolumeSpecName: "kube-api-access-sbvz5") pod "5d32494b-495f-4f2b-bffd-e514f409a5fd" (UID: "5d32494b-495f-4f2b-bffd-e514f409a5fd"). InnerVolumeSpecName "kube-api-access-sbvz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.603216 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5d32494b-495f-4f2b-bffd-e514f409a5fd" (UID: "5d32494b-495f-4f2b-bffd-e514f409a5fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.638208 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d32494b-495f-4f2b-bffd-e514f409a5fd" (UID: "5d32494b-495f-4f2b-bffd-e514f409a5fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.646784 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-config-data" (OuterVolumeSpecName: "config-data") pod "5d32494b-495f-4f2b-bffd-e514f409a5fd" (UID: "5d32494b-495f-4f2b-bffd-e514f409a5fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.697489 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbvz5\" (UniqueName: \"kubernetes.io/projected/5d32494b-495f-4f2b-bffd-e514f409a5fd-kube-api-access-sbvz5\") on node \"crc\" DevicePath \"\""
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.697529 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-config-data\") on node \"crc\" DevicePath \"\""
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.697538 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 06 13:01:05 crc kubenswrapper[4892]: I1006 13:01:05.697546 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d32494b-495f-4f2b-bffd-e514f409a5fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 06 13:01:06 crc kubenswrapper[4892]: I1006 13:01:06.100659 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329261-8fz25" event={"ID":"5d32494b-495f-4f2b-bffd-e514f409a5fd","Type":"ContainerDied","Data":"9105e901dc951e7f28d16b0b80c3971458a9f3b4f9a4e8298cb675a67abacfc5"}
Oct 06 13:01:06 crc kubenswrapper[4892]: I1006 13:01:06.100707 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9105e901dc951e7f28d16b0b80c3971458a9f3b4f9a4e8298cb675a67abacfc5"
Oct 06 13:01:06 crc kubenswrapper[4892]: I1006 13:01:06.100770 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329261-8fz25"
Oct 06 13:01:08 crc kubenswrapper[4892]: I1006 13:01:08.170461 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:01:08 crc kubenswrapper[4892]: E1006 13:01:08.171569 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:01:21 crc kubenswrapper[4892]: I1006 13:01:21.168680 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:01:21 crc kubenswrapper[4892]: E1006 13:01:21.169914 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:01:34 crc kubenswrapper[4892]: I1006 13:01:34.181692 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:01:34 crc kubenswrapper[4892]: E1006 13:01:34.182812 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:01:46 crc kubenswrapper[4892]: I1006 13:01:46.168906 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:01:46 crc kubenswrapper[4892]: E1006 13:01:46.169806 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:01:58 crc kubenswrapper[4892]: I1006 13:01:58.172563 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:01:58 crc kubenswrapper[4892]: E1006 13:01:58.173278 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.582990 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4qwnl"]
Oct 06 13:02:10 crc kubenswrapper[4892]: E1006 13:02:10.584131 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d32494b-495f-4f2b-bffd-e514f409a5fd" containerName="keystone-cron"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.584147 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d32494b-495f-4f2b-bffd-e514f409a5fd" containerName="keystone-cron"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.584419 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d32494b-495f-4f2b-bffd-e514f409a5fd" containerName="keystone-cron"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.586170 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.605609 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qwnl"]
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.669236 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-utilities\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.669359 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr99z\" (UniqueName: \"kubernetes.io/projected/10970580-2f08-430b-bd81-1a2183cb0980-kube-api-access-sr99z\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.669702 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-catalog-content\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.772204 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-utilities\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.772343 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr99z\" (UniqueName: \"kubernetes.io/projected/10970580-2f08-430b-bd81-1a2183cb0980-kube-api-access-sr99z\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.772430 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-catalog-content\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.772945 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-utilities\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.772996 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-catalog-content\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.798217 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr99z\" (UniqueName: \"kubernetes.io/projected/10970580-2f08-430b-bd81-1a2183cb0980-kube-api-access-sr99z\") pod \"redhat-operators-4qwnl\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") " pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:10 crc kubenswrapper[4892]: I1006 13:02:10.920695 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:11 crc kubenswrapper[4892]: I1006 13:02:11.378299 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4qwnl"]
Oct 06 13:02:11 crc kubenswrapper[4892]: I1006 13:02:11.848611 4892 generic.go:334] "Generic (PLEG): container finished" podID="10970580-2f08-430b-bd81-1a2183cb0980" containerID="e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860" exitCode=0
Oct 06 13:02:11 crc kubenswrapper[4892]: I1006 13:02:11.848676 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qwnl" event={"ID":"10970580-2f08-430b-bd81-1a2183cb0980","Type":"ContainerDied","Data":"e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860"}
Oct 06 13:02:11 crc kubenswrapper[4892]: I1006 13:02:11.848716 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qwnl" event={"ID":"10970580-2f08-430b-bd81-1a2183cb0980","Type":"ContainerStarted","Data":"59136872f568944ae19ac95962f8b04d6530732d1942dd43327426f085d30d15"}
Oct 06 13:02:11 crc kubenswrapper[4892]: I1006 13:02:11.851615 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 06 13:02:13 crc kubenswrapper[4892]: I1006 13:02:13.169435 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:02:13 crc kubenswrapper[4892]: E1006 13:02:13.169925 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:02:13 crc kubenswrapper[4892]: I1006 13:02:13.882116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qwnl" event={"ID":"10970580-2f08-430b-bd81-1a2183cb0980","Type":"ContainerStarted","Data":"484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3"}
Oct 06 13:02:19 crc kubenswrapper[4892]: I1006 13:02:19.953352 4892 generic.go:334] "Generic (PLEG): container finished" podID="10970580-2f08-430b-bd81-1a2183cb0980" containerID="484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3" exitCode=0
Oct 06 13:02:19 crc kubenswrapper[4892]: I1006 13:02:19.953436 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qwnl" event={"ID":"10970580-2f08-430b-bd81-1a2183cb0980","Type":"ContainerDied","Data":"484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3"}
Oct 06 13:02:21 crc kubenswrapper[4892]: I1006 13:02:21.979439 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qwnl" event={"ID":"10970580-2f08-430b-bd81-1a2183cb0980","Type":"ContainerStarted","Data":"7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838"}
Oct 06 13:02:22 crc kubenswrapper[4892]: I1006 13:02:22.017548 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4qwnl" podStartSLOduration=3.006835774 podStartE2EDuration="12.017526212s" podCreationTimestamp="2025-10-06 13:02:10 +0000 UTC" firstStartedPulling="2025-10-06 13:02:11.851266139 +0000 UTC m=+3218.400971914" lastFinishedPulling="2025-10-06 13:02:20.861956547 +0000 UTC m=+3227.411662352" observedRunningTime="2025-10-06 13:02:22.00528972 +0000 UTC m=+3228.554995495" watchObservedRunningTime="2025-10-06 13:02:22.017526212 +0000 UTC m=+3228.567231977"
Oct 06 13:02:25 crc kubenswrapper[4892]: I1006 13:02:25.169121 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:02:25 crc kubenswrapper[4892]: E1006 13:02:25.170418 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:02:30 crc kubenswrapper[4892]: I1006 13:02:30.921061 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:30 crc kubenswrapper[4892]: I1006 13:02:30.921738 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:30 crc kubenswrapper[4892]: I1006 13:02:30.986570 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:31 crc kubenswrapper[4892]: I1006 13:02:31.123027 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:31 crc kubenswrapper[4892]: I1006 13:02:31.227617 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qwnl"]
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.091957 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4qwnl" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="registry-server" containerID="cri-o://7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838" gracePeriod=2
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.607910 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.690184 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-utilities\") pod \"10970580-2f08-430b-bd81-1a2183cb0980\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") "
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.690388 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr99z\" (UniqueName: \"kubernetes.io/projected/10970580-2f08-430b-bd81-1a2183cb0980-kube-api-access-sr99z\") pod \"10970580-2f08-430b-bd81-1a2183cb0980\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") "
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.690407 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-catalog-content\") pod \"10970580-2f08-430b-bd81-1a2183cb0980\" (UID: \"10970580-2f08-430b-bd81-1a2183cb0980\") "
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.691691 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-utilities" (OuterVolumeSpecName: "utilities") pod "10970580-2f08-430b-bd81-1a2183cb0980" (UID: "10970580-2f08-430b-bd81-1a2183cb0980"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.695438 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10970580-2f08-430b-bd81-1a2183cb0980-kube-api-access-sr99z" (OuterVolumeSpecName: "kube-api-access-sr99z") pod "10970580-2f08-430b-bd81-1a2183cb0980" (UID: "10970580-2f08-430b-bd81-1a2183cb0980"). InnerVolumeSpecName "kube-api-access-sr99z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.785531 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10970580-2f08-430b-bd81-1a2183cb0980" (UID: "10970580-2f08-430b-bd81-1a2183cb0980"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.792398 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr99z\" (UniqueName: \"kubernetes.io/projected/10970580-2f08-430b-bd81-1a2183cb0980-kube-api-access-sr99z\") on node \"crc\" DevicePath \"\""
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.792427 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 06 13:02:33 crc kubenswrapper[4892]: I1006 13:02:33.792437 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10970580-2f08-430b-bd81-1a2183cb0980-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.110293 4892 generic.go:334] "Generic (PLEG): container finished" podID="10970580-2f08-430b-bd81-1a2183cb0980" containerID="7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838" exitCode=0
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.110431 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qwnl" event={"ID":"10970580-2f08-430b-bd81-1a2183cb0980","Type":"ContainerDied","Data":"7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838"}
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.110466 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4qwnl"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.110652 4892 scope.go:117] "RemoveContainer" containerID="7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.110634 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4qwnl" event={"ID":"10970580-2f08-430b-bd81-1a2183cb0980","Type":"ContainerDied","Data":"59136872f568944ae19ac95962f8b04d6530732d1942dd43327426f085d30d15"}
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.150879 4892 scope.go:117] "RemoveContainer" containerID="484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.164864 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4qwnl"]
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.188124 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4qwnl"]
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.190276 4892 scope.go:117] "RemoveContainer" containerID="e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.247536 4892 scope.go:117] "RemoveContainer" containerID="7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838"
Oct 06 13:02:34 crc kubenswrapper[4892]: E1006 13:02:34.248023 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838\": container with ID starting with 7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838 not found: ID does not exist" containerID="7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.248083 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838"} err="failed to get container status \"7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838\": rpc error: code = NotFound desc = could not find container \"7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838\": container with ID starting with 7e2d5dfe5e83b382226619baa7755f6bd7a11b52f8468ea8e7a655da03ebf838 not found: ID does not exist"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.248116 4892 scope.go:117] "RemoveContainer" containerID="484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3"
Oct 06 13:02:34 crc kubenswrapper[4892]: E1006 13:02:34.248718 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3\": container with ID starting with 484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3 not found: ID does not exist" containerID="484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.248761 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3"} err="failed to get container status \"484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3\": rpc error: code = NotFound desc = could not find container \"484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3\": container with ID starting with 484daf6d28dc8488fda713f33cb9d3bb4dafd59316bb990950cef82b1c750cc3 not found: ID does not exist"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.248790 4892 scope.go:117] "RemoveContainer" containerID="e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860"
Oct 06 13:02:34 crc kubenswrapper[4892]: E1006 13:02:34.249345 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860\": container with ID starting with e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860 not found: ID does not exist" containerID="e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860"
Oct 06 13:02:34 crc kubenswrapper[4892]: I1006 13:02:34.249378 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860"} err="failed to get container status \"e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860\": rpc error: code = NotFound desc = could not find container \"e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860\": container with ID starting with e1876c544be9940bb5a1e8144321f0af1d0d419bd46b04ac9f134eea5b5ff860 not found: ID does not exist"
Oct 06 13:02:36 crc kubenswrapper[4892]: I1006 13:02:36.181232 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10970580-2f08-430b-bd81-1a2183cb0980" path="/var/lib/kubelet/pods/10970580-2f08-430b-bd81-1a2183cb0980/volumes"
Oct 06 13:02:38 crc kubenswrapper[4892]: I1006 13:02:38.169859 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:02:38 crc kubenswrapper[4892]: E1006 13:02:38.170862 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:02:51 crc kubenswrapper[4892]: I1006 13:02:51.169635 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:02:51 crc kubenswrapper[4892]: E1006 13:02:51.170767 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:03:06 crc kubenswrapper[4892]: I1006 13:03:06.168549 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:03:06 crc kubenswrapper[4892]: E1006 13:03:06.169530 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:03:17 crc kubenswrapper[4892]: I1006 13:03:17.170504 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:03:17 crc kubenswrapper[4892]: E1006 13:03:17.171146 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:03:31 crc kubenswrapper[4892]: I1006 13:03:31.169853 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:03:31 crc kubenswrapper[4892]: E1006 13:03:31.171206 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:03:42 crc kubenswrapper[4892]: I1006 13:03:42.169081 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264"
Oct 06 13:03:42 crc kubenswrapper[4892]: E1006 13:03:42.170477 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:03:55 crc kubenswrapper[4892]: I1006 13:03:55.169183 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:03:55 crc kubenswrapper[4892]: E1006 13:03:55.170547 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:04:09 crc kubenswrapper[4892]: I1006 13:04:09.169223 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:04:09 crc kubenswrapper[4892]: E1006 13:04:09.169985 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:04:21 crc kubenswrapper[4892]: I1006 13:04:21.170066 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:04:21 crc kubenswrapper[4892]: E1006 13:04:21.171949 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:04:34 crc kubenswrapper[4892]: I1006 13:04:34.181421 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:04:34 crc kubenswrapper[4892]: E1006 13:04:34.182587 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:04:46 crc kubenswrapper[4892]: I1006 13:04:46.169203 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:04:46 crc kubenswrapper[4892]: E1006 13:04:46.170593 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" 
podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:04:58 crc kubenswrapper[4892]: I1006 13:04:58.173294 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:04:58 crc kubenswrapper[4892]: E1006 13:04:58.174008 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.580639 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h2mzd"] Oct 06 13:05:04 crc kubenswrapper[4892]: E1006 13:05:04.582284 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="extract-content" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.582318 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="extract-content" Oct 06 13:05:04 crc kubenswrapper[4892]: E1006 13:05:04.582379 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="registry-server" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.582399 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="registry-server" Oct 06 13:05:04 crc kubenswrapper[4892]: E1006 13:05:04.582510 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="extract-utilities" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.582528 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="extract-utilities" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.583069 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="10970580-2f08-430b-bd81-1a2183cb0980" containerName="registry-server" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.586945 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.612549 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2mzd"] Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.752649 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-utilities\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.752712 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29c9v\" (UniqueName: \"kubernetes.io/projected/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-kube-api-access-29c9v\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.752736 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-catalog-content\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.855286 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-utilities\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.855404 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29c9v\" (UniqueName: \"kubernetes.io/projected/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-kube-api-access-29c9v\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.855444 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-catalog-content\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.855889 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-utilities\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.856092 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-catalog-content\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.877761 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-29c9v\" (UniqueName: \"kubernetes.io/projected/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-kube-api-access-29c9v\") pod \"certified-operators-h2mzd\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:04 crc kubenswrapper[4892]: I1006 13:05:04.929389 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:05 crc kubenswrapper[4892]: I1006 13:05:05.431358 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h2mzd"] Oct 06 13:05:05 crc kubenswrapper[4892]: I1006 13:05:05.938603 4892 generic.go:334] "Generic (PLEG): container finished" podID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerID="4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87" exitCode=0 Oct 06 13:05:05 crc kubenswrapper[4892]: I1006 13:05:05.939026 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2mzd" event={"ID":"3c84b9a0-4032-4bb7-855c-9b410cd0aa76","Type":"ContainerDied","Data":"4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87"} Oct 06 13:05:05 crc kubenswrapper[4892]: I1006 13:05:05.939076 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2mzd" event={"ID":"3c84b9a0-4032-4bb7-855c-9b410cd0aa76","Type":"ContainerStarted","Data":"3edc90a00ab3d56ee90a5729d9cb0bd5f8b9f77ff8e308c3a5257985de698a89"} Oct 06 13:05:07 crc kubenswrapper[4892]: I1006 13:05:07.964234 4892 generic.go:334] "Generic (PLEG): container finished" podID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerID="fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba" exitCode=0 Oct 06 13:05:07 crc kubenswrapper[4892]: I1006 13:05:07.964311 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2mzd" event={"ID":"3c84b9a0-4032-4bb7-855c-9b410cd0aa76","Type":"ContainerDied","Data":"fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba"} Oct 06 13:05:08 crc kubenswrapper[4892]: I1006 13:05:08.981193 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2mzd" event={"ID":"3c84b9a0-4032-4bb7-855c-9b410cd0aa76","Type":"ContainerStarted","Data":"ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb"} Oct 06 13:05:09 crc kubenswrapper[4892]: I1006 13:05:09.003079 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h2mzd" podStartSLOduration=2.487760136 podStartE2EDuration="5.003055047s" podCreationTimestamp="2025-10-06 13:05:04 +0000 UTC" firstStartedPulling="2025-10-06 13:05:05.942206711 +0000 UTC m=+3392.491912516" lastFinishedPulling="2025-10-06 13:05:08.457501662 +0000 UTC m=+3395.007207427" observedRunningTime="2025-10-06 13:05:08.999397462 +0000 UTC m=+3395.549103247" watchObservedRunningTime="2025-10-06 13:05:09.003055047 +0000 UTC m=+3395.552760822" Oct 06 13:05:13 crc kubenswrapper[4892]: I1006 13:05:13.169128 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:05:13 crc kubenswrapper[4892]: E1006 13:05:13.169914 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:05:14 crc kubenswrapper[4892]: I1006 13:05:14.929793 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:14 crc kubenswrapper[4892]: I1006 13:05:14.929870 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:15 crc kubenswrapper[4892]: I1006 13:05:15.001682 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:15 crc kubenswrapper[4892]: I1006 13:05:15.144922 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:15 crc kubenswrapper[4892]: I1006 13:05:15.253085 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2mzd"] Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.081003 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h2mzd" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="registry-server" containerID="cri-o://ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb" gracePeriod=2 Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.626953 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.765857 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29c9v\" (UniqueName: \"kubernetes.io/projected/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-kube-api-access-29c9v\") pod \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.766294 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-catalog-content\") pod \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.766596 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-utilities\") pod \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\" (UID: \"3c84b9a0-4032-4bb7-855c-9b410cd0aa76\") " Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.768055 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-utilities" (OuterVolumeSpecName: "utilities") pod "3c84b9a0-4032-4bb7-855c-9b410cd0aa76" (UID: "3c84b9a0-4032-4bb7-855c-9b410cd0aa76"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.773062 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-kube-api-access-29c9v" (OuterVolumeSpecName: "kube-api-access-29c9v") pod "3c84b9a0-4032-4bb7-855c-9b410cd0aa76" (UID: "3c84b9a0-4032-4bb7-855c-9b410cd0aa76"). InnerVolumeSpecName "kube-api-access-29c9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.869742 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29c9v\" (UniqueName: \"kubernetes.io/projected/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-kube-api-access-29c9v\") on node \"crc\" DevicePath \"\"" Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.869790 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:05:17 crc kubenswrapper[4892]: I1006 13:05:17.987015 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c84b9a0-4032-4bb7-855c-9b410cd0aa76" (UID: "3c84b9a0-4032-4bb7-855c-9b410cd0aa76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.075448 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c84b9a0-4032-4bb7-855c-9b410cd0aa76-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.095861 4892 generic.go:334] "Generic (PLEG): container finished" podID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerID="ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb" exitCode=0 Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.095919 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2mzd" event={"ID":"3c84b9a0-4032-4bb7-855c-9b410cd0aa76","Type":"ContainerDied","Data":"ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb"} Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.095970 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h2mzd" event={"ID":"3c84b9a0-4032-4bb7-855c-9b410cd0aa76","Type":"ContainerDied","Data":"3edc90a00ab3d56ee90a5729d9cb0bd5f8b9f77ff8e308c3a5257985de698a89"} Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.096059 4892 scope.go:117] "RemoveContainer" containerID="ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.096952 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h2mzd" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.142579 4892 scope.go:117] "RemoveContainer" containerID="fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.154625 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h2mzd"] Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.182176 4892 scope.go:117] "RemoveContainer" containerID="4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.197044 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h2mzd"] Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.236748 4892 scope.go:117] "RemoveContainer" containerID="ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb" Oct 06 13:05:18 crc kubenswrapper[4892]: E1006 13:05:18.237336 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb\": container with ID starting with ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb not found: ID does not exist" containerID="ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.237410 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb"} err="failed to get container status \"ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb\": rpc error: code = NotFound desc = could not find container \"ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb\": container with ID starting with ebc6ec73c47d0e0d5c6335ed3e1d7a94a49fbf5dbd21c1059f5059e3ca9140eb not found: ID does not exist" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.237444 4892 scope.go:117] "RemoveContainer" containerID="fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba" Oct 06 13:05:18 crc kubenswrapper[4892]: E1006 13:05:18.238031 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba\": container with ID starting with fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba not found: ID does not exist" containerID="fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.238067 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba"} err="failed to get container status \"fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba\": rpc error: code = NotFound desc = could not find container \"fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba\": container with ID starting with fddb86b9eca72f2c625fea536152ec30079a0786c54074c77db9f4e8bacf49ba not found: ID does not exist" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.238098 4892 scope.go:117] "RemoveContainer" containerID="4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87" Oct 06 13:05:18 crc kubenswrapper[4892]: E1006 13:05:18.239300 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87\": container with ID starting with 4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87 not found: ID does not exist" containerID="4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87" Oct 06 13:05:18 crc kubenswrapper[4892]: I1006 13:05:18.239356 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87"} err="failed to get container status \"4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87\": rpc error: code = NotFound desc = could not find container \"4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87\": container with ID starting with 4aff89d2abe0f6dae52606e2f9b59682fb5019104dd773becda23488ce8bba87 not found: ID does not exist" Oct 06 13:05:20 crc kubenswrapper[4892]: I1006 13:05:20.183159 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" path="/var/lib/kubelet/pods/3c84b9a0-4032-4bb7-855c-9b410cd0aa76/volumes" Oct 06 13:05:24 crc kubenswrapper[4892]: I1006 13:05:24.182767 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:05:24 crc kubenswrapper[4892]: E1006 13:05:24.183609 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:05:38 crc kubenswrapper[4892]: I1006 13:05:38.168839 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:05:38 crc kubenswrapper[4892]: E1006 13:05:38.169833 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:05:53 crc kubenswrapper[4892]: I1006 13:05:53.168484 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:05:53 crc kubenswrapper[4892]: I1006 13:05:53.474185 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"eb6af8cff51300447d2882e9902e5ccb316bb63aeb4c54b4164f50ad2f842d7c"} Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.696774 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tbtkk"] Oct 06 13:06:14 crc kubenswrapper[4892]: E1006 13:06:14.698433 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="extract-content" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.698471 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="extract-content" Oct 06 13:06:14 crc kubenswrapper[4892]: E1006 13:06:14.698519 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="registry-server" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.698538 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="registry-server" Oct 06 13:06:14 crc kubenswrapper[4892]: E1006 13:06:14.698636 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="extract-utilities" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.698655 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="extract-utilities" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.699211 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c84b9a0-4032-4bb7-855c-9b410cd0aa76" containerName="registry-server" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.701653 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.711803 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbtkk"] Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.780486 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-utilities\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.780973 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwxgq\" (UniqueName: \"kubernetes.io/projected/bc87b791-eff8-47b3-afae-585d4e4f4f01-kube-api-access-wwxgq\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.781003 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-catalog-content\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.882392 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-utilities\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.882633 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwxgq\" (UniqueName: \"kubernetes.io/projected/bc87b791-eff8-47b3-afae-585d4e4f4f01-kube-api-access-wwxgq\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.882672 4892 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-catalog-content\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.882925 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-utilities\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.883139 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-catalog-content\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:14 crc kubenswrapper[4892]: I1006 13:06:14.902265 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwxgq\" (UniqueName: \"kubernetes.io/projected/bc87b791-eff8-47b3-afae-585d4e4f4f01-kube-api-access-wwxgq\") pod \"redhat-marketplace-tbtkk\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:15 crc kubenswrapper[4892]: I1006 13:06:15.040871 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:15 crc kubenswrapper[4892]: I1006 13:06:15.519281 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbtkk"] Oct 06 13:06:15 crc kubenswrapper[4892]: I1006 13:06:15.712640 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbtkk" event={"ID":"bc87b791-eff8-47b3-afae-585d4e4f4f01","Type":"ContainerStarted","Data":"51d69fb534cdd85546bc3a92b700ef67ba7989bc459db7a3964b41caf282f4fc"} Oct 06 13:06:16 crc kubenswrapper[4892]: I1006 13:06:16.731554 4892 generic.go:334] "Generic (PLEG): container finished" podID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerID="d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe" exitCode=0 Oct 06 13:06:16 crc kubenswrapper[4892]: I1006 13:06:16.731730 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbtkk" event={"ID":"bc87b791-eff8-47b3-afae-585d4e4f4f01","Type":"ContainerDied","Data":"d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe"} Oct 06 13:06:17 crc kubenswrapper[4892]: I1006 13:06:17.757064 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbtkk" event={"ID":"bc87b791-eff8-47b3-afae-585d4e4f4f01","Type":"ContainerStarted","Data":"fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e"} Oct 06 13:06:18 crc kubenswrapper[4892]: I1006 13:06:18.772613 4892 generic.go:334] "Generic (PLEG): container finished" podID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerID="fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e" exitCode=0 Oct 06 13:06:18 crc kubenswrapper[4892]: I1006 13:06:18.773117 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbtkk" 
event={"ID":"bc87b791-eff8-47b3-afae-585d4e4f4f01","Type":"ContainerDied","Data":"fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e"} Oct 06 13:06:19 crc kubenswrapper[4892]: I1006 13:06:19.788610 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbtkk" event={"ID":"bc87b791-eff8-47b3-afae-585d4e4f4f01","Type":"ContainerStarted","Data":"76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d"} Oct 06 13:06:19 crc kubenswrapper[4892]: I1006 13:06:19.810482 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tbtkk" podStartSLOduration=3.361160351 podStartE2EDuration="5.810462746s" podCreationTimestamp="2025-10-06 13:06:14 +0000 UTC" firstStartedPulling="2025-10-06 13:06:16.73460943 +0000 UTC m=+3463.284315195" lastFinishedPulling="2025-10-06 13:06:19.183911795 +0000 UTC m=+3465.733617590" observedRunningTime="2025-10-06 13:06:19.809463047 +0000 UTC m=+3466.359168832" watchObservedRunningTime="2025-10-06 13:06:19.810462746 +0000 UTC m=+3466.360168511" Oct 06 13:06:25 crc kubenswrapper[4892]: I1006 13:06:25.042083 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:25 crc kubenswrapper[4892]: I1006 13:06:25.043079 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:25 crc kubenswrapper[4892]: I1006 13:06:25.132314 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:25 crc kubenswrapper[4892]: I1006 13:06:25.947308 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:26 crc kubenswrapper[4892]: I1006 13:06:26.025590 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbtkk"] Oct 06 13:06:27 crc kubenswrapper[4892]: I1006 13:06:27.889546 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tbtkk" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="registry-server" containerID="cri-o://76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d" gracePeriod=2 Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.363908 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.388303 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwxgq\" (UniqueName: \"kubernetes.io/projected/bc87b791-eff8-47b3-afae-585d4e4f4f01-kube-api-access-wwxgq\") pod \"bc87b791-eff8-47b3-afae-585d4e4f4f01\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.388527 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-utilities\") pod \"bc87b791-eff8-47b3-afae-585d4e4f4f01\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.388731 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-catalog-content\") pod \"bc87b791-eff8-47b3-afae-585d4e4f4f01\" (UID: \"bc87b791-eff8-47b3-afae-585d4e4f4f01\") " Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.390267 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-utilities" (OuterVolumeSpecName: "utilities") pod "bc87b791-eff8-47b3-afae-585d4e4f4f01" (UID: "bc87b791-eff8-47b3-afae-585d4e4f4f01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.401564 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc87b791-eff8-47b3-afae-585d4e4f4f01-kube-api-access-wwxgq" (OuterVolumeSpecName: "kube-api-access-wwxgq") pod "bc87b791-eff8-47b3-afae-585d4e4f4f01" (UID: "bc87b791-eff8-47b3-afae-585d4e4f4f01"). InnerVolumeSpecName "kube-api-access-wwxgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.413259 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc87b791-eff8-47b3-afae-585d4e4f4f01" (UID: "bc87b791-eff8-47b3-afae-585d4e4f4f01"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.492288 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwxgq\" (UniqueName: \"kubernetes.io/projected/bc87b791-eff8-47b3-afae-585d4e4f4f01-kube-api-access-wwxgq\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.492579 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.492589 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc87b791-eff8-47b3-afae-585d4e4f4f01-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.909707 4892 generic.go:334] "Generic (PLEG): container finished" podID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerID="76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d" exitCode=0 Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.909791 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tbtkk" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.909815 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbtkk" event={"ID":"bc87b791-eff8-47b3-afae-585d4e4f4f01","Type":"ContainerDied","Data":"76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d"} Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.911273 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tbtkk" event={"ID":"bc87b791-eff8-47b3-afae-585d4e4f4f01","Type":"ContainerDied","Data":"51d69fb534cdd85546bc3a92b700ef67ba7989bc459db7a3964b41caf282f4fc"} Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.911309 4892 scope.go:117] "RemoveContainer" containerID="76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.959881 4892 scope.go:117] "RemoveContainer" containerID="fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e" Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.970824 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbtkk"] Oct 06 13:06:28 crc kubenswrapper[4892]: I1006 13:06:28.989042 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tbtkk"] Oct 06 13:06:29 crc kubenswrapper[4892]: I1006 13:06:29.003439 4892 scope.go:117] "RemoveContainer" containerID="d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe" Oct 06 13:06:29 crc kubenswrapper[4892]: I1006 13:06:29.085345 4892 scope.go:117] "RemoveContainer" containerID="76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d" Oct 06 13:06:29 crc kubenswrapper[4892]: E1006 13:06:29.086091 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d\": container with ID starting with 76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d not found: ID does not exist" containerID="76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d" Oct 06 13:06:29 crc kubenswrapper[4892]: I1006 13:06:29.086136 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d"} err="failed to get container status \"76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d\": rpc error: code = NotFound desc = could not find container \"76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d\": container with ID starting with 76d2c8511400a3b6af9924acb592ffcd5fa388d22cc7cfa4944f90103a7fd02d not found: ID does not exist" Oct 06 13:06:29 crc kubenswrapper[4892]: I1006 13:06:29.086167 4892 scope.go:117] "RemoveContainer" containerID="fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e" Oct 06 13:06:29 crc kubenswrapper[4892]: E1006 13:06:29.086542 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e\": container with ID starting with fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e not found: ID does not exist" containerID="fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e" Oct 06 13:06:29 crc kubenswrapper[4892]: I1006 13:06:29.086567 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e"} err="failed to get container status \"fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e\": rpc error: code = NotFound desc = could not find container \"fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e\": container with ID starting with fac37790b1ffc11262abbb1be0622e060aa6b870d10ff9aa5a9fca7f7b3fc67e not found: ID does not exist" Oct 06 13:06:29 crc kubenswrapper[4892]: I1006 13:06:29.086583 4892 scope.go:117] "RemoveContainer" containerID="d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe" Oct 06 13:06:29 crc kubenswrapper[4892]: E1006 13:06:29.086981 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe\": container with ID starting with d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe not found: ID does not exist" containerID="d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe" Oct 06 13:06:29 crc kubenswrapper[4892]: I1006 13:06:29.087010 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe"} err="failed to get container status \"d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe\": rpc error: code = NotFound desc = could not find container \"d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe\": container with ID starting with d6c5d7d70cca09286d5ed58e3899870518b0ec2136d34856bd06b0e06d1996fe not found: ID does not exist" Oct 06 13:06:30 crc kubenswrapper[4892]: I1006 13:06:30.187103 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" path="/var/lib/kubelet/pods/bc87b791-eff8-47b3-afae-585d4e4f4f01/volumes" Oct 06 13:07:59 crc kubenswrapper[4892]: I1006 13:07:59.156595 4892 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-57bbd8d677-4mwpb" podUID="9b16ec0c-fdde-42a8-9a45-da67ecd56360" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 
502" Oct 06 13:08:22 crc kubenswrapper[4892]: I1006 13:08:22.984433 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:08:22 crc kubenswrapper[4892]: I1006 13:08:22.985194 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:08:52 crc kubenswrapper[4892]: I1006 13:08:52.984552 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:08:52 crc kubenswrapper[4892]: I1006 13:08:52.985242 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:09:22 crc kubenswrapper[4892]: I1006 13:09:22.984510 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:09:22 crc kubenswrapper[4892]: I1006 13:09:22.985493 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:09:22 crc kubenswrapper[4892]: I1006 13:09:22.985627 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:09:22 crc kubenswrapper[4892]: I1006 13:09:22.987816 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb6af8cff51300447d2882e9902e5ccb316bb63aeb4c54b4164f50ad2f842d7c"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:09:22 crc kubenswrapper[4892]: I1006 13:09:22.987934 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://eb6af8cff51300447d2882e9902e5ccb316bb63aeb4c54b4164f50ad2f842d7c" gracePeriod=600 Oct 06 13:09:23 crc kubenswrapper[4892]: I1006 13:09:23.870656 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="eb6af8cff51300447d2882e9902e5ccb316bb63aeb4c54b4164f50ad2f842d7c" exitCode=0 Oct 06 13:09:23 
crc kubenswrapper[4892]: I1006 13:09:23.870822 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"eb6af8cff51300447d2882e9902e5ccb316bb63aeb4c54b4164f50ad2f842d7c"} Oct 06 13:09:23 crc kubenswrapper[4892]: I1006 13:09:23.871352 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504"} Oct 06 13:09:23 crc kubenswrapper[4892]: I1006 13:09:23.871385 4892 scope.go:117] "RemoveContainer" containerID="21ff2353f5e3898d521a04329246f1c56bd9b4ea653d4bbf259a433285447264" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.827893 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mcl6j"] Oct 06 13:09:58 crc kubenswrapper[4892]: E1006 13:09:58.830590 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="extract-content" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.830612 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="extract-content" Oct 06 13:09:58 crc kubenswrapper[4892]: E1006 13:09:58.830652 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="extract-utilities" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.830661 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="extract-utilities" Oct 06 13:09:58 crc kubenswrapper[4892]: E1006 13:09:58.830679 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="registry-server" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.830687 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="registry-server" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.830947 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc87b791-eff8-47b3-afae-585d4e4f4f01" containerName="registry-server" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.835269 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.843639 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcl6j"] Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.989339 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-catalog-content\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.989705 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmx8w\" (UniqueName: \"kubernetes.io/projected/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-kube-api-access-hmx8w\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:58 crc kubenswrapper[4892]: I1006 13:09:58.990088 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-utilities\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.091666 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-catalog-content\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.092055 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmx8w\" (UniqueName: \"kubernetes.io/projected/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-kube-api-access-hmx8w\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.092158 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-utilities\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.092721 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-catalog-content\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.092993 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-utilities\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.119819 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hmx8w\" (UniqueName: \"kubernetes.io/projected/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-kube-api-access-hmx8w\") pod \"community-operators-mcl6j\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") " pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.166455 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:09:59 crc kubenswrapper[4892]: I1006 13:09:59.724210 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcl6j"] Oct 06 13:10:00 crc kubenswrapper[4892]: I1006 13:10:00.340312 4892 generic.go:334] "Generic (PLEG): container finished" podID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerID="fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068" exitCode=0 Oct 06 13:10:00 crc kubenswrapper[4892]: I1006 13:10:00.340424 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcl6j" event={"ID":"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8","Type":"ContainerDied","Data":"fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068"} Oct 06 13:10:00 crc kubenswrapper[4892]: I1006 13:10:00.340701 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcl6j" event={"ID":"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8","Type":"ContainerStarted","Data":"525168f0bacf01d46d85f80eb15911c5a328629d9d70a1be3b985ad4af2cbf1f"} Oct 06 13:10:00 crc kubenswrapper[4892]: I1006 13:10:00.344843 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:10:02 crc kubenswrapper[4892]: I1006 13:10:02.365686 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcl6j" event={"ID":"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8","Type":"ContainerStarted","Data":"8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1"} Oct 06 13:10:03 crc kubenswrapper[4892]: I1006 13:10:03.377052 4892 generic.go:334] "Generic (PLEG): container finished" podID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerID="8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1" exitCode=0 Oct 06 13:10:03 crc kubenswrapper[4892]: I1006 13:10:03.377094 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcl6j" event={"ID":"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8","Type":"ContainerDied","Data":"8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1"} Oct 06 13:10:04 crc kubenswrapper[4892]: I1006 13:10:04.390602 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcl6j" event={"ID":"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8","Type":"ContainerStarted","Data":"c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da"} Oct 06 13:10:09 crc kubenswrapper[4892]: I1006 13:10:09.167877 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:10:09 crc kubenswrapper[4892]: I1006 13:10:09.168764 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:10:09 crc kubenswrapper[4892]: I1006 13:10:09.241289 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:10:09 crc 
Oct 06 13:10:09 crc kubenswrapper[4892]: I1006 13:10:09.266209 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mcl6j" podStartSLOduration=7.625538183 podStartE2EDuration="11.266191351s" podCreationTimestamp="2025-10-06 13:09:58 +0000 UTC" firstStartedPulling="2025-10-06 13:10:00.344472522 +0000 UTC m=+3686.894178327" lastFinishedPulling="2025-10-06 13:10:03.98512573 +0000 UTC m=+3690.534831495" observedRunningTime="2025-10-06 13:10:04.411004028 +0000 UTC m=+3690.960709793" watchObservedRunningTime="2025-10-06 13:10:09.266191351 +0000 UTC m=+3695.815897126"
Oct 06 13:10:09 crc kubenswrapper[4892]: I1006 13:10:09.502465 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mcl6j"
Oct 06 13:10:09 crc kubenswrapper[4892]: I1006 13:10:09.554208 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mcl6j"]
Oct 06 13:10:11 crc kubenswrapper[4892]: I1006 13:10:11.459401 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mcl6j" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="registry-server" containerID="cri-o://c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da" gracePeriod=2
Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.009642 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mcl6j"
Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.074014 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-catalog-content\") pod \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") "
Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.074094 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmx8w\" (UniqueName: \"kubernetes.io/projected/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-kube-api-access-hmx8w\") pod \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") "
Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.074126 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-utilities\") pod \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\" (UID: \"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8\") "
Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.075107 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-utilities" (OuterVolumeSpecName: "utilities") pod "8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" (UID: "8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.127377 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" (UID: "8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.177036 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.177069 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmx8w\" (UniqueName: \"kubernetes.io/projected/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-kube-api-access-hmx8w\") on node \"crc\" DevicePath \"\"" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.177085 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.477036 4892 generic.go:334] "Generic (PLEG): container finished" podID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerID="c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da" exitCode=0 Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.477116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcl6j" event={"ID":"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8","Type":"ContainerDied","Data":"c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da"} Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.477166 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mcl6j" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.477184 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcl6j" event={"ID":"8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8","Type":"ContainerDied","Data":"525168f0bacf01d46d85f80eb15911c5a328629d9d70a1be3b985ad4af2cbf1f"} Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.477218 4892 scope.go:117] "RemoveContainer" containerID="c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.510345 4892 scope.go:117] "RemoveContainer" containerID="8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.517689 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mcl6j"] Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.532128 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mcl6j"] Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.567398 4892 scope.go:117] "RemoveContainer" containerID="fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.597887 4892 scope.go:117] "RemoveContainer" containerID="c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da" Oct 06 13:10:12 crc kubenswrapper[4892]: E1006 13:10:12.598382 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da\": container with ID starting with c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da not found: ID does not exist" containerID="c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.598417 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da"} err="failed to get container status \"c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da\": rpc error: code = NotFound desc = could not find container \"c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da\": container with ID starting with c5bacdfc5fdbee532e64f0bf95fd7cc3ff346d00f5b2c4ee04e262624e1770da not found: ID does not exist" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.598445 4892 scope.go:117] "RemoveContainer" containerID="8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1" Oct 06 13:10:12 crc kubenswrapper[4892]: E1006 13:10:12.598885 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1\": container with ID starting with 8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1 not found: ID does not exist" containerID="8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.598912 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1"} err="failed to get container status \"8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1\": rpc error: code = NotFound desc = could not find 
container \"8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1\": container with ID starting with 8dcf2c6449853c340350e6d52c52188573895a8825a9aaf644855360972e47d1 not found: ID does not exist" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.598928 4892 scope.go:117] "RemoveContainer" containerID="fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068" Oct 06 13:10:12 crc kubenswrapper[4892]: E1006 13:10:12.603653 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068\": container with ID starting with fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068 not found: ID does not exist" containerID="fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068" Oct 06 13:10:12 crc kubenswrapper[4892]: I1006 13:10:12.603721 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068"} err="failed to get container status \"fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068\": rpc error: code = NotFound desc = could not find container \"fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068\": container with ID starting with fc48ae4a07736e9dc4bc6d0d2303967405277ac81180f3432b5de99e4478e068 not found: ID does not exist" Oct 06 13:10:14 crc kubenswrapper[4892]: I1006 13:10:14.184137 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" path="/var/lib/kubelet/pods/8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8/volumes" Oct 06 13:11:52 crc kubenswrapper[4892]: I1006 13:11:52.984126 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:11:52 crc kubenswrapper[4892]: I1006 13:11:52.984808 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:12:22 crc kubenswrapper[4892]: I1006 13:12:22.984291 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:12:22 crc kubenswrapper[4892]: I1006 13:12:22.984942 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:12:52 crc kubenswrapper[4892]: I1006 13:12:52.984219 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 
Oct 06 13:12:52 crc kubenswrapper[4892]: I1006 13:12:52.984924 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 06 13:12:52 crc kubenswrapper[4892]: I1006 13:12:52.984993 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s"
Oct 06 13:12:52 crc kubenswrapper[4892]: I1006 13:12:52.986015 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 06 13:12:52 crc kubenswrapper[4892]: I1006 13:12:52.986100 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" gracePeriod=600
Oct 06 13:12:53 crc kubenswrapper[4892]: E1006 13:12:53.159454 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:12:53 crc kubenswrapper[4892]: I1006 13:12:53.434084 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" exitCode=0
Oct 06 13:12:53 crc kubenswrapper[4892]: I1006 13:12:53.434148 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504"}
Oct 06 13:12:53 crc kubenswrapper[4892]: I1006 13:12:53.434185 4892 scope.go:117] "RemoveContainer" containerID="eb6af8cff51300447d2882e9902e5ccb316bb63aeb4c54b4164f50ad2f842d7c"
Oct 06 13:12:53 crc kubenswrapper[4892]: I1006 13:12:53.435541 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504"
Oct 06 13:12:53 crc kubenswrapper[4892]: E1006 13:12:53.436195 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:12:57 crc kubenswrapper[4892]: I1006 13:12:57.935763 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6ltk5"]
Oct 06 13:12:57 crc kubenswrapper[4892]: E1006 13:12:57.937053 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="extract-utilities"
Oct 06 13:12:57 crc kubenswrapper[4892]: I1006 13:12:57.937070 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="extract-utilities"
Oct 06 13:12:57 crc kubenswrapper[4892]: E1006 13:12:57.937102 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="extract-content"
Oct 06 13:12:57 crc kubenswrapper[4892]: I1006 13:12:57.937110 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="extract-content"
Oct 06 13:12:57 crc kubenswrapper[4892]: E1006 13:12:57.937124 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="registry-server"
Oct 06 13:12:57 crc kubenswrapper[4892]: I1006 13:12:57.937134 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="registry-server"
Oct 06 13:12:57 crc kubenswrapper[4892]: I1006 13:12:57.939379 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b12c7dd-d4d4-4932-9ce4-0adbaa78e7b8" containerName="registry-server"
Oct 06 13:12:57 crc kubenswrapper[4892]: I1006 13:12:57.942343 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:12:57 crc kubenswrapper[4892]: I1006 13:12:57.957014 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ltk5"]
Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.025690 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-utilities\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.025748 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-catalog-content\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.025818 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q789g\" (UniqueName: \"kubernetes.io/projected/782b19a3-7932-418f-8d04-22cfdaebfb38-kube-api-access-q789g\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.127678 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-utilities\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5"
\"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-catalog-content\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.127842 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q789g\" (UniqueName: \"kubernetes.io/projected/782b19a3-7932-418f-8d04-22cfdaebfb38-kube-api-access-q789g\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.128371 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-catalog-content\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.128390 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-utilities\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.176197 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q789g\" (UniqueName: \"kubernetes.io/projected/782b19a3-7932-418f-8d04-22cfdaebfb38-kube-api-access-q789g\") pod \"redhat-operators-6ltk5\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.277269 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:12:58 crc kubenswrapper[4892]: I1006 13:12:58.871469 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ltk5"] Oct 06 13:12:58 crc kubenswrapper[4892]: W1006 13:12:58.883760 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782b19a3_7932_418f_8d04_22cfdaebfb38.slice/crio-7c17a4b094b952aabb998cc34ced39b31c3f292e8b94d8d9a6d95b30ed3525fd WatchSource:0}: Error finding container 7c17a4b094b952aabb998cc34ced39b31c3f292e8b94d8d9a6d95b30ed3525fd: Status 404 returned error can't find the container with id 7c17a4b094b952aabb998cc34ced39b31c3f292e8b94d8d9a6d95b30ed3525fd Oct 06 13:12:59 crc kubenswrapper[4892]: I1006 13:12:59.554728 4892 generic.go:334] "Generic (PLEG): container finished" podID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerID="74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b" exitCode=0 Oct 06 13:12:59 crc kubenswrapper[4892]: I1006 13:12:59.554793 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ltk5" event={"ID":"782b19a3-7932-418f-8d04-22cfdaebfb38","Type":"ContainerDied","Data":"74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b"} Oct 06 13:12:59 crc kubenswrapper[4892]: I1006 13:12:59.555130 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ltk5" event={"ID":"782b19a3-7932-418f-8d04-22cfdaebfb38","Type":"ContainerStarted","Data":"7c17a4b094b952aabb998cc34ced39b31c3f292e8b94d8d9a6d95b30ed3525fd"} Oct 06 13:13:01 crc kubenswrapper[4892]: I1006 13:13:01.581377 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ltk5" event={"ID":"782b19a3-7932-418f-8d04-22cfdaebfb38","Type":"ContainerStarted","Data":"4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a"} Oct 06 13:13:02 crc kubenswrapper[4892]: I1006 13:13:02.594406 4892 generic.go:334] "Generic (PLEG): container finished" podID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerID="4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a" exitCode=0 Oct 06 13:13:02 crc kubenswrapper[4892]: I1006 13:13:02.594521 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ltk5" event={"ID":"782b19a3-7932-418f-8d04-22cfdaebfb38","Type":"ContainerDied","Data":"4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a"} Oct 06 13:13:03 crc kubenswrapper[4892]: I1006 13:13:03.604591 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ltk5" event={"ID":"782b19a3-7932-418f-8d04-22cfdaebfb38","Type":"ContainerStarted","Data":"b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b"} Oct 06 13:13:03 crc kubenswrapper[4892]: I1006 13:13:03.628873 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ltk5" podStartSLOduration=2.9045072960000002 podStartE2EDuration="6.628853249s" podCreationTimestamp="2025-10-06 13:12:57 +0000 UTC" firstStartedPulling="2025-10-06 13:12:59.557192634 +0000 UTC m=+3866.106898409" lastFinishedPulling="2025-10-06 13:13:03.281538597 +0000 UTC m=+3869.831244362" observedRunningTime="2025-10-06 13:13:03.623340679 +0000 UTC m=+3870.173046444" watchObservedRunningTime="2025-10-06 13:13:03.628853249 +0000 UTC m=+3870.178559014" Oct 06 13:13:07 crc 
Oct 06 13:13:07 crc kubenswrapper[4892]: I1006 13:13:07.168576 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504"
Oct 06 13:13:07 crc kubenswrapper[4892]: E1006 13:13:07.169350 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38"
Oct 06 13:13:08 crc kubenswrapper[4892]: I1006 13:13:08.277933 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:13:08 crc kubenswrapper[4892]: I1006 13:13:08.278207 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:13:09 crc kubenswrapper[4892]: I1006 13:13:09.327924 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6ltk5" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="registry-server" probeResult="failure" output=<
Oct 06 13:13:09 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s
Oct 06 13:13:09 crc kubenswrapper[4892]: >
Oct 06 13:13:18 crc kubenswrapper[4892]: I1006 13:13:18.338896 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:13:18 crc kubenswrapper[4892]: I1006 13:13:18.394927 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ltk5"
Oct 06 13:13:18 crc kubenswrapper[4892]: I1006 13:13:18.579251 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ltk5"]
Oct 06 13:13:19 crc kubenswrapper[4892]: I1006 13:13:19.794533 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6ltk5" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="registry-server" containerID="cri-o://b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b" gracePeriod=2
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.432238 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q789g\" (UniqueName: \"kubernetes.io/projected/782b19a3-7932-418f-8d04-22cfdaebfb38-kube-api-access-q789g\") pod \"782b19a3-7932-418f-8d04-22cfdaebfb38\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.432478 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-utilities\") pod \"782b19a3-7932-418f-8d04-22cfdaebfb38\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.432588 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-catalog-content\") pod \"782b19a3-7932-418f-8d04-22cfdaebfb38\" (UID: \"782b19a3-7932-418f-8d04-22cfdaebfb38\") " Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.434236 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-utilities" (OuterVolumeSpecName: "utilities") pod "782b19a3-7932-418f-8d04-22cfdaebfb38" (UID: "782b19a3-7932-418f-8d04-22cfdaebfb38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.437984 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782b19a3-7932-418f-8d04-22cfdaebfb38-kube-api-access-q789g" (OuterVolumeSpecName: "kube-api-access-q789g") pod "782b19a3-7932-418f-8d04-22cfdaebfb38" (UID: "782b19a3-7932-418f-8d04-22cfdaebfb38"). InnerVolumeSpecName "kube-api-access-q789g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.515573 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "782b19a3-7932-418f-8d04-22cfdaebfb38" (UID: "782b19a3-7932-418f-8d04-22cfdaebfb38"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.536493 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q789g\" (UniqueName: \"kubernetes.io/projected/782b19a3-7932-418f-8d04-22cfdaebfb38-kube-api-access-q789g\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.536545 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.536569 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/782b19a3-7932-418f-8d04-22cfdaebfb38-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.805845 4892 generic.go:334] "Generic (PLEG): container finished" podID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerID="b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b" exitCode=0 Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.805886 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ltk5" event={"ID":"782b19a3-7932-418f-8d04-22cfdaebfb38","Type":"ContainerDied","Data":"b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b"} Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.805936 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ltk5" event={"ID":"782b19a3-7932-418f-8d04-22cfdaebfb38","Type":"ContainerDied","Data":"7c17a4b094b952aabb998cc34ced39b31c3f292e8b94d8d9a6d95b30ed3525fd"} Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.805962 4892 scope.go:117] "RemoveContainer" containerID="b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.805901 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ltk5" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.825613 4892 scope.go:117] "RemoveContainer" containerID="4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.843998 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ltk5"] Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.852122 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6ltk5"] Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.872681 4892 scope.go:117] "RemoveContainer" containerID="74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.910480 4892 scope.go:117] "RemoveContainer" containerID="b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b" Oct 06 13:13:20 crc kubenswrapper[4892]: E1006 13:13:20.911076 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b\": container with ID starting with b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b not found: ID does not exist" containerID="b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.911136 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b"} err="failed to get container status \"b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b\": rpc error: code = NotFound desc = could not find container \"b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b\": container with ID starting with b8bf599392f8a589344950ff496a003bffbfbe8ae486883241fe867dac89dd9b not found: ID does not exist" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.911171 4892 scope.go:117] "RemoveContainer" containerID="4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a" Oct 06 13:13:20 crc kubenswrapper[4892]: E1006 13:13:20.911634 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a\": container with ID starting with 4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a not found: ID does not exist" containerID="4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.911671 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a"} err="failed to get container status \"4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a\": rpc error: code = NotFound desc = could not find container \"4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a\": container with ID starting with 4f558f2321c7bd6e3118c2a032b95268ce03eba9b70db5a35235538fc854a63a not found: ID does not exist" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.911711 4892 scope.go:117] "RemoveContainer" containerID="74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b" Oct 06 13:13:20 crc kubenswrapper[4892]: E1006 13:13:20.912239 4892 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b\": container with ID starting with 74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b not found: ID does not exist" containerID="74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b" Oct 06 13:13:20 crc kubenswrapper[4892]: I1006 13:13:20.912268 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b"} err="failed to get container status \"74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b\": rpc error: code = NotFound desc = could not find container \"74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b\": container with ID starting with 74a2f87beb0c3a8e02b5e110320fd6e9ddd280c9e626056f9d9e8f2d22eea16b not found: ID does not exist" Oct 06 13:13:22 crc kubenswrapper[4892]: I1006 13:13:22.168906 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:13:22 crc kubenswrapper[4892]: E1006 13:13:22.169421 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:13:22 crc kubenswrapper[4892]: I1006 13:13:22.189013 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" path="/var/lib/kubelet/pods/782b19a3-7932-418f-8d04-22cfdaebfb38/volumes" Oct 06 13:13:33 crc kubenswrapper[4892]: I1006 13:13:33.169461 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:13:33 crc kubenswrapper[4892]: E1006 13:13:33.170642 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:13:44 crc kubenswrapper[4892]: I1006 13:13:44.181526 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:13:44 crc kubenswrapper[4892]: E1006 13:13:44.182806 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:13:59 crc kubenswrapper[4892]: I1006 13:13:59.169223 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:13:59 crc kubenswrapper[4892]: E1006 13:13:59.171390 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:14:13 crc kubenswrapper[4892]: I1006 13:14:13.168973 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:14:13 crc kubenswrapper[4892]: E1006 13:14:13.169990 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:14:27 crc kubenswrapper[4892]: I1006 13:14:27.169148 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:14:27 crc kubenswrapper[4892]: E1006 13:14:27.171038 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:14:39 crc kubenswrapper[4892]: I1006 13:14:39.169552 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:14:39 crc kubenswrapper[4892]: E1006 13:14:39.170576 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:14:51 crc kubenswrapper[4892]: I1006 13:14:51.169131 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:14:51 crc kubenswrapper[4892]: E1006 13:14:51.170367 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.157090 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk"] Oct 06 13:15:00 crc kubenswrapper[4892]: E1006 13:15:00.158051 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="extract-content" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.158068 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="extract-content" Oct 06 13:15:00 crc kubenswrapper[4892]: E1006 13:15:00.158089 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="registry-server" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.158097 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="registry-server" Oct 06 13:15:00 crc kubenswrapper[4892]: E1006 13:15:00.158124 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="extract-utilities" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.158132 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="extract-utilities" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.158419 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="782b19a3-7932-418f-8d04-22cfdaebfb38" containerName="registry-server" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.159296 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.162161 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.164874 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.186581 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk"] Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.249138 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56eb2a3f-a505-4016-9e9d-9fadd306f540-secret-volume\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.249272 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/56eb2a3f-a505-4016-9e9d-9fadd306f540-kube-api-access-5bxh5\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.249315 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56eb2a3f-a505-4016-9e9d-9fadd306f540-config-volume\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.350670 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/56eb2a3f-a505-4016-9e9d-9fadd306f540-kube-api-access-5bxh5\") pod \"collect-profiles-29329275-mrxlk\" (UID: 
\"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.350745 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56eb2a3f-a505-4016-9e9d-9fadd306f540-config-volume\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.350873 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56eb2a3f-a505-4016-9e9d-9fadd306f540-secret-volume\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.352508 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56eb2a3f-a505-4016-9e9d-9fadd306f540-config-volume\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.356819 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56eb2a3f-a505-4016-9e9d-9fadd306f540-secret-volume\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.371447 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/56eb2a3f-a505-4016-9e9d-9fadd306f540-kube-api-access-5bxh5\") pod \"collect-profiles-29329275-mrxlk\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.495704 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:00 crc kubenswrapper[4892]: I1006 13:15:00.984135 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk"] Oct 06 13:15:00 crc kubenswrapper[4892]: W1006 13:15:00.993603 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56eb2a3f_a505_4016_9e9d_9fadd306f540.slice/crio-181d3d86cd79eeafceac4620562fe0302f6edb261b47133efcfc0ecd31b22a5c WatchSource:0}: Error finding container 181d3d86cd79eeafceac4620562fe0302f6edb261b47133efcfc0ecd31b22a5c: Status 404 returned error can't find the container with id 181d3d86cd79eeafceac4620562fe0302f6edb261b47133efcfc0ecd31b22a5c Oct 06 13:15:01 crc kubenswrapper[4892]: I1006 13:15:01.948684 4892 generic.go:334] "Generic (PLEG): container finished" podID="56eb2a3f-a505-4016-9e9d-9fadd306f540" containerID="05510eab11c4a7352b0a8bce1de5e685634e8042913fc492eda8f8c3148d0e53" exitCode=0 Oct 06 13:15:01 crc kubenswrapper[4892]: I1006 13:15:01.948760 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" event={"ID":"56eb2a3f-a505-4016-9e9d-9fadd306f540","Type":"ContainerDied","Data":"05510eab11c4a7352b0a8bce1de5e685634e8042913fc492eda8f8c3148d0e53"} Oct 06 13:15:01 crc kubenswrapper[4892]: I1006 13:15:01.948965 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" event={"ID":"56eb2a3f-a505-4016-9e9d-9fadd306f540","Type":"ContainerStarted","Data":"181d3d86cd79eeafceac4620562fe0302f6edb261b47133efcfc0ecd31b22a5c"} Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.312211 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.414700 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56eb2a3f-a505-4016-9e9d-9fadd306f540-secret-volume\") pod \"56eb2a3f-a505-4016-9e9d-9fadd306f540\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.414787 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/56eb2a3f-a505-4016-9e9d-9fadd306f540-kube-api-access-5bxh5\") pod \"56eb2a3f-a505-4016-9e9d-9fadd306f540\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.414843 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56eb2a3f-a505-4016-9e9d-9fadd306f540-config-volume\") pod \"56eb2a3f-a505-4016-9e9d-9fadd306f540\" (UID: \"56eb2a3f-a505-4016-9e9d-9fadd306f540\") " Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.415570 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56eb2a3f-a505-4016-9e9d-9fadd306f540-config-volume" (OuterVolumeSpecName: "config-volume") pod "56eb2a3f-a505-4016-9e9d-9fadd306f540" (UID: "56eb2a3f-a505-4016-9e9d-9fadd306f540"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.420085 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56eb2a3f-a505-4016-9e9d-9fadd306f540-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "56eb2a3f-a505-4016-9e9d-9fadd306f540" (UID: "56eb2a3f-a505-4016-9e9d-9fadd306f540"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.420309 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56eb2a3f-a505-4016-9e9d-9fadd306f540-kube-api-access-5bxh5" (OuterVolumeSpecName: "kube-api-access-5bxh5") pod "56eb2a3f-a505-4016-9e9d-9fadd306f540" (UID: "56eb2a3f-a505-4016-9e9d-9fadd306f540"). InnerVolumeSpecName "kube-api-access-5bxh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.517607 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/56eb2a3f-a505-4016-9e9d-9fadd306f540-kube-api-access-5bxh5\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.517643 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56eb2a3f-a505-4016-9e9d-9fadd306f540-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.517654 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56eb2a3f-a505-4016-9e9d-9fadd306f540-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.975258 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" event={"ID":"56eb2a3f-a505-4016-9e9d-9fadd306f540","Type":"ContainerDied","Data":"181d3d86cd79eeafceac4620562fe0302f6edb261b47133efcfc0ecd31b22a5c"} Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.975665 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="181d3d86cd79eeafceac4620562fe0302f6edb261b47133efcfc0ecd31b22a5c" Oct 06 13:15:03 crc kubenswrapper[4892]: I1006 13:15:03.975391 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk" Oct 06 13:15:04 crc kubenswrapper[4892]: I1006 13:15:04.411299 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"] Oct 06 13:15:04 crc kubenswrapper[4892]: I1006 13:15:04.423863 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329230-2db6f"] Oct 06 13:15:05 crc kubenswrapper[4892]: I1006 13:15:05.168377 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:15:05 crc kubenswrapper[4892]: E1006 13:15:05.168646 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:15:06 crc kubenswrapper[4892]: I1006 13:15:06.192040 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf18611-f7e0-4b32-9ef1-4695b6e28af3" path="/var/lib/kubelet/pods/5bf18611-f7e0-4b32-9ef1-4695b6e28af3/volumes" Oct 06 13:15:20 crc kubenswrapper[4892]: I1006 13:15:20.169666 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:15:20 crc kubenswrapper[4892]: E1006 13:15:20.170595 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:15:35 crc kubenswrapper[4892]: I1006 13:15:35.169934 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:15:35 crc kubenswrapper[4892]: E1006 13:15:35.171235 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:15:47 crc kubenswrapper[4892]: I1006 13:15:47.170820 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:15:47 crc kubenswrapper[4892]: E1006 13:15:47.172388 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.184596 4892 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-2cq4d"] Oct 06 13:15:49 crc kubenswrapper[4892]: E1006 13:15:49.185565 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eb2a3f-a505-4016-9e9d-9fadd306f540" containerName="collect-profiles" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.185583 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eb2a3f-a505-4016-9e9d-9fadd306f540" containerName="collect-profiles" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.185839 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eb2a3f-a505-4016-9e9d-9fadd306f540" containerName="collect-profiles" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.187693 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.200068 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-utilities\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.200117 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-catalog-content\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.200396 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqvc\" (UniqueName: \"kubernetes.io/projected/a44c0747-afe9-4a9a-aa36-190954511ef3-kube-api-access-zvqvc\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.201765 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2cq4d"] Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.303147 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqvc\" (UniqueName: \"kubernetes.io/projected/a44c0747-afe9-4a9a-aa36-190954511ef3-kube-api-access-zvqvc\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.303270 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-utilities\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.303297 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-catalog-content\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.303890 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-utilities\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.304418 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-catalog-content\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.333808 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqvc\" (UniqueName: \"kubernetes.io/projected/a44c0747-afe9-4a9a-aa36-190954511ef3-kube-api-access-zvqvc\") pod \"certified-operators-2cq4d\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:49 crc kubenswrapper[4892]: I1006 13:15:49.507030 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:50 crc kubenswrapper[4892]: I1006 13:15:50.057218 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2cq4d"] Oct 06 13:15:50 crc kubenswrapper[4892]: W1006 13:15:50.064158 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda44c0747_afe9_4a9a_aa36_190954511ef3.slice/crio-cf90abb1a45281ae9bb2e475b9f98f5b70725ce4a03722e7c27a333cccd883b3 WatchSource:0}: Error finding container cf90abb1a45281ae9bb2e475b9f98f5b70725ce4a03722e7c27a333cccd883b3: Status 404 returned error can't find the container with id cf90abb1a45281ae9bb2e475b9f98f5b70725ce4a03722e7c27a333cccd883b3 Oct 06 13:15:50 crc kubenswrapper[4892]: I1006 13:15:50.527615 4892 generic.go:334] "Generic (PLEG): container finished" podID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerID="683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6" exitCode=0 Oct 06 13:15:50 crc kubenswrapper[4892]: I1006 13:15:50.527772 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cq4d" event={"ID":"a44c0747-afe9-4a9a-aa36-190954511ef3","Type":"ContainerDied","Data":"683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6"} Oct 06 13:15:50 crc kubenswrapper[4892]: I1006 13:15:50.528009 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cq4d" event={"ID":"a44c0747-afe9-4a9a-aa36-190954511ef3","Type":"ContainerStarted","Data":"cf90abb1a45281ae9bb2e475b9f98f5b70725ce4a03722e7c27a333cccd883b3"} Oct 06 13:15:50 crc kubenswrapper[4892]: I1006 13:15:50.531046 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:15:52 crc kubenswrapper[4892]: I1006 13:15:52.559481 4892 generic.go:334] "Generic (PLEG): container finished" podID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerID="9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5" exitCode=0 Oct 06 13:15:52 crc kubenswrapper[4892]: I1006 13:15:52.559563 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cq4d" 
event={"ID":"a44c0747-afe9-4a9a-aa36-190954511ef3","Type":"ContainerDied","Data":"9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5"} Oct 06 13:15:53 crc kubenswrapper[4892]: I1006 13:15:53.578148 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cq4d" event={"ID":"a44c0747-afe9-4a9a-aa36-190954511ef3","Type":"ContainerStarted","Data":"f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a"} Oct 06 13:15:53 crc kubenswrapper[4892]: I1006 13:15:53.612951 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2cq4d" podStartSLOduration=1.99365546 podStartE2EDuration="4.612932002s" podCreationTimestamp="2025-10-06 13:15:49 +0000 UTC" firstStartedPulling="2025-10-06 13:15:50.53063901 +0000 UTC m=+4037.080344805" lastFinishedPulling="2025-10-06 13:15:53.149915542 +0000 UTC m=+4039.699621347" observedRunningTime="2025-10-06 13:15:53.610042439 +0000 UTC m=+4040.159748224" watchObservedRunningTime="2025-10-06 13:15:53.612932002 +0000 UTC m=+4040.162637777" Oct 06 13:15:58 crc kubenswrapper[4892]: I1006 13:15:58.169111 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:15:58 crc kubenswrapper[4892]: E1006 13:15:58.169901 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:15:59 crc kubenswrapper[4892]: I1006 13:15:59.507817 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:59 crc kubenswrapper[4892]: I1006 13:15:59.508408 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:59 crc kubenswrapper[4892]: I1006 13:15:59.579426 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:59 crc kubenswrapper[4892]: I1006 13:15:59.712701 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:15:59 crc kubenswrapper[4892]: I1006 13:15:59.822551 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2cq4d"] Oct 06 13:16:01 crc kubenswrapper[4892]: I1006 13:16:01.679962 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2cq4d" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="registry-server" containerID="cri-o://f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a" gracePeriod=2 Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.234577 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.301467 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvqvc\" (UniqueName: \"kubernetes.io/projected/a44c0747-afe9-4a9a-aa36-190954511ef3-kube-api-access-zvqvc\") pod \"a44c0747-afe9-4a9a-aa36-190954511ef3\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.301530 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-utilities\") pod \"a44c0747-afe9-4a9a-aa36-190954511ef3\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.301592 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-catalog-content\") pod \"a44c0747-afe9-4a9a-aa36-190954511ef3\" (UID: \"a44c0747-afe9-4a9a-aa36-190954511ef3\") " Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.302540 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-utilities" (OuterVolumeSpecName: "utilities") pod "a44c0747-afe9-4a9a-aa36-190954511ef3" (UID: "a44c0747-afe9-4a9a-aa36-190954511ef3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.308024 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44c0747-afe9-4a9a-aa36-190954511ef3-kube-api-access-zvqvc" (OuterVolumeSpecName: "kube-api-access-zvqvc") pod "a44c0747-afe9-4a9a-aa36-190954511ef3" (UID: "a44c0747-afe9-4a9a-aa36-190954511ef3"). InnerVolumeSpecName "kube-api-access-zvqvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.404039 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvqvc\" (UniqueName: \"kubernetes.io/projected/a44c0747-afe9-4a9a-aa36-190954511ef3-kube-api-access-zvqvc\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.404073 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.595418 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a44c0747-afe9-4a9a-aa36-190954511ef3" (UID: "a44c0747-afe9-4a9a-aa36-190954511ef3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.608441 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a44c0747-afe9-4a9a-aa36-190954511ef3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.695060 4892 generic.go:334] "Generic (PLEG): container finished" podID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerID="f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a" exitCode=0 Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.695176 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2cq4d" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.695176 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cq4d" event={"ID":"a44c0747-afe9-4a9a-aa36-190954511ef3","Type":"ContainerDied","Data":"f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a"} Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.695658 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2cq4d" event={"ID":"a44c0747-afe9-4a9a-aa36-190954511ef3","Type":"ContainerDied","Data":"cf90abb1a45281ae9bb2e475b9f98f5b70725ce4a03722e7c27a333cccd883b3"} Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.695688 4892 scope.go:117] "RemoveContainer" containerID="f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.726384 4892 scope.go:117] "RemoveContainer" containerID="9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.752566 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2cq4d"] Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.761562 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2cq4d"] Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.774140 4892 scope.go:117] "RemoveContainer" containerID="683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.833676 4892 scope.go:117] "RemoveContainer" containerID="f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a" Oct 06 13:16:02 crc kubenswrapper[4892]: E1006 13:16:02.834072 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a\": container with ID starting with f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a not found: ID does not exist" containerID="f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a" Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.834103 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a"} err="failed to get container status \"f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a\": rpc error: code = NotFound desc = could not find container \"f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a\": container with ID starting with f74956ca549b710b1c2da10626df77c18796ceb9697fee97c41e46485e2dff1a not found: ID does not exist" Oct 06 
13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.834124 4892 scope.go:117] "RemoveContainer" containerID="9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5"
Oct 06 13:16:02 crc kubenswrapper[4892]: E1006 13:16:02.834490 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5\": container with ID starting with 9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5 not found: ID does not exist" containerID="9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5"
Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.834515 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5"} err="failed to get container status \"9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5\": rpc error: code = NotFound desc = could not find container \"9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5\": container with ID starting with 9f71e26c0c19166f4b89602a6e34240ac303d030c9e8d34c0e4f672accf957a5 not found: ID does not exist"
Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.834528 4892 scope.go:117] "RemoveContainer" containerID="683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6"
Oct 06 13:16:02 crc kubenswrapper[4892]: E1006 13:16:02.834743 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6\": container with ID starting with 683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6 not found: ID does not exist" containerID="683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6"
Oct 06 13:16:02 crc kubenswrapper[4892]: I1006 13:16:02.834765 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6"} err="failed to get container status \"683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6\": rpc error: code = NotFound desc = could not find container \"683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6\": container with ID starting with 683e61d4071ce01536c0cf5b7431571569da605269e5693e2fe8714ad0f7a9e6 not found: ID does not exist"
Oct 06 13:16:03 crc kubenswrapper[4892]: I1006 13:16:03.408654 4892 scope.go:117] "RemoveContainer" containerID="9010a4dbf5b36a066fcac4dd65e36a415abaf6f044df5a5b865eb6b6b3ea307f"
Oct 06 13:16:04 crc kubenswrapper[4892]: I1006 13:16:04.190276 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" path="/var/lib/kubelet/pods/a44c0747-afe9-4a9a-aa36-190954511ef3/volumes"
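The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triplets are a benign race, like the earlier cadvisor "Failed to process watch event ... 404" lines: the kubelet asks CRI-O about container IDs that a parallel cleanup already removed, gets NotFound, and moves on; the pod's volume dir is still cleaned up at 13:16:04. A minimal sketch of the idempotent-delete pattern this implies, using gRPC status codes; the remove callback is hypothetical, standing in for the runtime-service call:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfPresent treats a gRPC NotFound from the runtime as success:
    // the container is already gone, so there is nothing left to delete.
    func removeIfPresent(remove func(id string) error, id string) error {
        err := remove(id)
        if err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    func main() {
        alreadyGone := func(id string) error {
            return status.Error(codes.NotFound, "could not find container "+id)
        }
        // Prints <nil>: the NotFound is swallowed, mirroring the log above,
        // where the error is recorded but cleanup still converges.
        fmt.Println(removeIfPresent(alreadyGone, "9f71e26c"))
    }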
podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:16:22 crc kubenswrapper[4892]: I1006 13:16:22.169299 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:16:22 crc kubenswrapper[4892]: E1006 13:16:22.170704 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:16:35 crc kubenswrapper[4892]: I1006 13:16:35.169002 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:16:35 crc kubenswrapper[4892]: E1006 13:16:35.170299 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:16:50 crc kubenswrapper[4892]: I1006 13:16:50.169279 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:16:50 crc kubenswrapper[4892]: E1006 13:16:50.170497 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:17:01 crc kubenswrapper[4892]: I1006 13:17:01.168628 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:17:01 crc kubenswrapper[4892]: E1006 13:17:01.169926 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:17:16 crc kubenswrapper[4892]: I1006 13:17:16.169070 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:17:16 crc kubenswrapper[4892]: E1006 13:17:16.170389 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:17:31 crc kubenswrapper[4892]: I1006 13:17:31.169659 4892 scope.go:117] "RemoveContainer" 
containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:17:31 crc kubenswrapper[4892]: E1006 13:17:31.170894 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:17:44 crc kubenswrapper[4892]: I1006 13:17:44.182853 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:17:44 crc kubenswrapper[4892]: E1006 13:17:44.188637 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:17:57 crc kubenswrapper[4892]: I1006 13:17:57.169041 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:17:58 crc kubenswrapper[4892]: I1006 13:17:58.184730 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"f6a2358a9140b05b01d2246bb549c63ea90b67cdbb7a2c48a6afe206243e02e2"} Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.563886 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z859k"] Oct 06 13:18:13 crc kubenswrapper[4892]: E1006 13:18:13.564961 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="extract-utilities" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.564981 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="extract-utilities" Oct 06 13:18:13 crc kubenswrapper[4892]: E1006 13:18:13.565015 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="registry-server" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.565023 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="registry-server" Oct 06 13:18:13 crc kubenswrapper[4892]: E1006 13:18:13.565053 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="extract-content" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.565062 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="extract-content" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.565406 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44c0747-afe9-4a9a-aa36-190954511ef3" containerName="registry-server" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.567462 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.577410 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z859k"] Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.591721 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-catalog-content\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.591806 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-utilities\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.592027 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npj8b\" (UniqueName: \"kubernetes.io/projected/bdddce76-4093-4f86-9b19-7b9e666ef0ad-kube-api-access-npj8b\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.694344 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-catalog-content\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.694404 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-utilities\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.695002 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-catalog-content\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.695040 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-utilities\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.695205 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npj8b\" (UniqueName: \"kubernetes.io/projected/bdddce76-4093-4f86-9b19-7b9e666ef0ad-kube-api-access-npj8b\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.775406 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-npj8b\" (UniqueName: \"kubernetes.io/projected/bdddce76-4093-4f86-9b19-7b9e666ef0ad-kube-api-access-npj8b\") pod \"redhat-marketplace-z859k\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:13 crc kubenswrapper[4892]: I1006 13:18:13.904596 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:14 crc kubenswrapper[4892]: I1006 13:18:14.436955 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z859k"] Oct 06 13:18:15 crc kubenswrapper[4892]: I1006 13:18:15.361310 4892 generic.go:334] "Generic (PLEG): container finished" podID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerID="33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44" exitCode=0 Oct 06 13:18:15 crc kubenswrapper[4892]: I1006 13:18:15.361376 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z859k" event={"ID":"bdddce76-4093-4f86-9b19-7b9e666ef0ad","Type":"ContainerDied","Data":"33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44"} Oct 06 13:18:15 crc kubenswrapper[4892]: I1006 13:18:15.361576 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z859k" event={"ID":"bdddce76-4093-4f86-9b19-7b9e666ef0ad","Type":"ContainerStarted","Data":"7f440550abf6d6562a28ffffcad403023ee05b539cb43094b314613f17d9da09"} Oct 06 13:18:16 crc kubenswrapper[4892]: I1006 13:18:16.378488 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z859k" event={"ID":"bdddce76-4093-4f86-9b19-7b9e666ef0ad","Type":"ContainerStarted","Data":"9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7"} Oct 06 13:18:17 crc kubenswrapper[4892]: I1006 13:18:17.389021 4892 generic.go:334] "Generic (PLEG): container finished" podID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerID="9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7" exitCode=0 Oct 06 13:18:17 crc kubenswrapper[4892]: I1006 13:18:17.389199 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z859k" event={"ID":"bdddce76-4093-4f86-9b19-7b9e666ef0ad","Type":"ContainerDied","Data":"9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7"} Oct 06 13:18:18 crc kubenswrapper[4892]: I1006 13:18:18.402505 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z859k" event={"ID":"bdddce76-4093-4f86-9b19-7b9e666ef0ad","Type":"ContainerStarted","Data":"caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3"} Oct 06 13:18:18 crc kubenswrapper[4892]: I1006 13:18:18.426109 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z859k" podStartSLOduration=2.984279924 podStartE2EDuration="5.426091391s" podCreationTimestamp="2025-10-06 13:18:13 +0000 UTC" firstStartedPulling="2025-10-06 13:18:15.363384066 +0000 UTC m=+4181.913089831" lastFinishedPulling="2025-10-06 13:18:17.805195533 +0000 UTC m=+4184.354901298" observedRunningTime="2025-10-06 13:18:18.423002951 +0000 UTC m=+4184.972708736" watchObservedRunningTime="2025-10-06 13:18:18.426091391 +0000 UTC m=+4184.975797156" Oct 06 13:18:23 crc kubenswrapper[4892]: I1006 13:18:23.905833 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:23 crc kubenswrapper[4892]: I1006 13:18:23.906449 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:23 crc kubenswrapper[4892]: I1006 13:18:23.994499 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:24 crc kubenswrapper[4892]: I1006 13:18:24.518442 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:24 crc kubenswrapper[4892]: I1006 13:18:24.585953 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z859k"] Oct 06 13:18:26 crc kubenswrapper[4892]: I1006 13:18:26.485342 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z859k" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerName="registry-server" containerID="cri-o://caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3" gracePeriod=2 Oct 06 13:18:26 crc kubenswrapper[4892]: I1006 13:18:26.959006 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.075987 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npj8b\" (UniqueName: \"kubernetes.io/projected/bdddce76-4093-4f86-9b19-7b9e666ef0ad-kube-api-access-npj8b\") pod \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.076061 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-catalog-content\") pod \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.076573 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-utilities\") pod \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\" (UID: \"bdddce76-4093-4f86-9b19-7b9e666ef0ad\") " Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.077775 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-utilities" (OuterVolumeSpecName: "utilities") pod "bdddce76-4093-4f86-9b19-7b9e666ef0ad" (UID: "bdddce76-4093-4f86-9b19-7b9e666ef0ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.086964 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdddce76-4093-4f86-9b19-7b9e666ef0ad-kube-api-access-npj8b" (OuterVolumeSpecName: "kube-api-access-npj8b") pod "bdddce76-4093-4f86-9b19-7b9e666ef0ad" (UID: "bdddce76-4093-4f86-9b19-7b9e666ef0ad"). InnerVolumeSpecName "kube-api-access-npj8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.105668 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdddce76-4093-4f86-9b19-7b9e666ef0ad" (UID: "bdddce76-4093-4f86-9b19-7b9e666ef0ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.179653 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npj8b\" (UniqueName: \"kubernetes.io/projected/bdddce76-4093-4f86-9b19-7b9e666ef0ad-kube-api-access-npj8b\") on node \"crc\" DevicePath \"\"" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.179685 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.179696 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdddce76-4093-4f86-9b19-7b9e666ef0ad-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.500651 4892 generic.go:334] "Generic (PLEG): container finished" podID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerID="caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3" exitCode=0 Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.500699 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z859k" event={"ID":"bdddce76-4093-4f86-9b19-7b9e666ef0ad","Type":"ContainerDied","Data":"caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3"} Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.500809 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z859k" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.500979 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z859k" event={"ID":"bdddce76-4093-4f86-9b19-7b9e666ef0ad","Type":"ContainerDied","Data":"7f440550abf6d6562a28ffffcad403023ee05b539cb43094b314613f17d9da09"} Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.500992 4892 scope.go:117] "RemoveContainer" containerID="caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.536763 4892 scope.go:117] "RemoveContainer" containerID="9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.553386 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z859k"] Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.562610 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z859k"] Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.575622 4892 scope.go:117] "RemoveContainer" containerID="33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.624622 4892 scope.go:117] "RemoveContainer" containerID="caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3" Oct 06 13:18:27 crc kubenswrapper[4892]: E1006 13:18:27.625442 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3\": container with ID starting with caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3 not found: ID does not exist" containerID="caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.625478 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3"} err="failed to get container status \"caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3\": rpc error: code = NotFound desc = could not find container \"caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3\": container with ID starting with caa4e2ea94ec7e7a8ea346047921dd13bd4f359987203bc946af635423bf48d3 not found: ID does not exist" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.625502 4892 scope.go:117] "RemoveContainer" containerID="9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7" Oct 06 13:18:27 crc kubenswrapper[4892]: E1006 13:18:27.626001 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7\": container with ID starting with 9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7 not found: ID does not exist" containerID="9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.626052 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7"} err="failed to get container status \"9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7\": rpc error: code = NotFound desc = could not find 
container \"9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7\": container with ID starting with 9efe115ac700bab47ce5ae320198f7795d6c6a0cb65125a065d994dfc26242e7 not found: ID does not exist" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.626081 4892 scope.go:117] "RemoveContainer" containerID="33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44" Oct 06 13:18:27 crc kubenswrapper[4892]: E1006 13:18:27.626397 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44\": container with ID starting with 33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44 not found: ID does not exist" containerID="33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44" Oct 06 13:18:27 crc kubenswrapper[4892]: I1006 13:18:27.626420 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44"} err="failed to get container status \"33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44\": rpc error: code = NotFound desc = could not find container \"33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44\": container with ID starting with 33490e8a51afc857759a7b5549b5b92a1519d9ea2a8b4c53593233cd88be8f44 not found: ID does not exist" Oct 06 13:18:28 crc kubenswrapper[4892]: I1006 13:18:28.186867 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" path="/var/lib/kubelet/pods/bdddce76-4093-4f86-9b19-7b9e666ef0ad/volumes" Oct 06 13:20:22 crc kubenswrapper[4892]: I1006 13:20:22.984313 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:20:22 crc kubenswrapper[4892]: I1006 13:20:22.985371 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:20:52 crc kubenswrapper[4892]: I1006 13:20:52.985311 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:20:52 crc kubenswrapper[4892]: I1006 13:20:52.986296 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.610582 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rctpl"] Oct 06 13:20:59 crc kubenswrapper[4892]: E1006 13:20:59.611473 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" 
containerName="registry-server" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.611486 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerName="registry-server" Oct 06 13:20:59 crc kubenswrapper[4892]: E1006 13:20:59.611505 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerName="extract-content" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.611511 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerName="extract-content" Oct 06 13:20:59 crc kubenswrapper[4892]: E1006 13:20:59.611530 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerName="extract-utilities" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.611538 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerName="extract-utilities" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.611734 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdddce76-4093-4f86-9b19-7b9e666ef0ad" containerName="registry-server" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.613197 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.627194 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rctpl"] Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.674626 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-catalog-content\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.674723 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-utilities\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.674796 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgpmb\" (UniqueName: \"kubernetes.io/projected/26b6c196-4f05-4ff7-a430-1951347c170c-kube-api-access-bgpmb\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.776236 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-utilities\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.776409 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgpmb\" (UniqueName: \"kubernetes.io/projected/26b6c196-4f05-4ff7-a430-1951347c170c-kube-api-access-bgpmb\") pod \"community-operators-rctpl\" (UID: 
\"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.776564 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-catalog-content\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.776946 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-utilities\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.776947 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-catalog-content\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.802845 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgpmb\" (UniqueName: \"kubernetes.io/projected/26b6c196-4f05-4ff7-a430-1951347c170c-kube-api-access-bgpmb\") pod \"community-operators-rctpl\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:20:59 crc kubenswrapper[4892]: I1006 13:20:59.947111 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:21:00 crc kubenswrapper[4892]: I1006 13:21:00.518179 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rctpl"] Oct 06 13:21:02 crc kubenswrapper[4892]: I1006 13:21:02.365437 4892 generic.go:334] "Generic (PLEG): container finished" podID="26b6c196-4f05-4ff7-a430-1951347c170c" containerID="15515ea45745b638d19fac8a6db780a1ba5f2e033738c18191903a32f03ce9fc" exitCode=0 Oct 06 13:21:02 crc kubenswrapper[4892]: I1006 13:21:02.367257 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rctpl" event={"ID":"26b6c196-4f05-4ff7-a430-1951347c170c","Type":"ContainerDied","Data":"15515ea45745b638d19fac8a6db780a1ba5f2e033738c18191903a32f03ce9fc"} Oct 06 13:21:02 crc kubenswrapper[4892]: I1006 13:21:02.367472 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rctpl" event={"ID":"26b6c196-4f05-4ff7-a430-1951347c170c","Type":"ContainerStarted","Data":"54c105e0657b90ee95ff05c50902a0019d359684836aa4bcc4807e5cef9e2db9"} Oct 06 13:21:02 crc kubenswrapper[4892]: I1006 13:21:02.370586 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:21:03 crc kubenswrapper[4892]: I1006 13:21:03.378639 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rctpl" event={"ID":"26b6c196-4f05-4ff7-a430-1951347c170c","Type":"ContainerStarted","Data":"47740143686e74fba18037e74ec6175149593a67f75ccdf92f351f450d7e5877"} Oct 06 13:21:04 crc kubenswrapper[4892]: I1006 13:21:04.396568 4892 generic.go:334] "Generic (PLEG): container finished" 
podID="26b6c196-4f05-4ff7-a430-1951347c170c" containerID="47740143686e74fba18037e74ec6175149593a67f75ccdf92f351f450d7e5877" exitCode=0 Oct 06 13:21:04 crc kubenswrapper[4892]: I1006 13:21:04.396639 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rctpl" event={"ID":"26b6c196-4f05-4ff7-a430-1951347c170c","Type":"ContainerDied","Data":"47740143686e74fba18037e74ec6175149593a67f75ccdf92f351f450d7e5877"} Oct 06 13:21:05 crc kubenswrapper[4892]: I1006 13:21:05.410697 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rctpl" event={"ID":"26b6c196-4f05-4ff7-a430-1951347c170c","Type":"ContainerStarted","Data":"65b6c4168d3f26dca8bc5cc0d46760741ec7f74f1e5b6c4263f753a6d991e847"} Oct 06 13:21:05 crc kubenswrapper[4892]: I1006 13:21:05.438417 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rctpl" podStartSLOduration=3.941356786 podStartE2EDuration="6.438397745s" podCreationTimestamp="2025-10-06 13:20:59 +0000 UTC" firstStartedPulling="2025-10-06 13:21:02.370263772 +0000 UTC m=+4348.919969527" lastFinishedPulling="2025-10-06 13:21:04.867304691 +0000 UTC m=+4351.417010486" observedRunningTime="2025-10-06 13:21:05.434862462 +0000 UTC m=+4351.984568227" watchObservedRunningTime="2025-10-06 13:21:05.438397745 +0000 UTC m=+4351.988103530" Oct 06 13:21:09 crc kubenswrapper[4892]: I1006 13:21:09.948074 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:21:09 crc kubenswrapper[4892]: I1006 13:21:09.948759 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:21:10 crc kubenswrapper[4892]: I1006 13:21:10.004180 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:21:10 crc kubenswrapper[4892]: I1006 13:21:10.531220 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:21:10 crc kubenswrapper[4892]: I1006 13:21:10.609558 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rctpl"] Oct 06 13:21:12 crc kubenswrapper[4892]: I1006 13:21:12.485646 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rctpl" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="registry-server" containerID="cri-o://65b6c4168d3f26dca8bc5cc0d46760741ec7f74f1e5b6c4263f753a6d991e847" gracePeriod=2 Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.503654 4892 generic.go:334] "Generic (PLEG): container finished" podID="26b6c196-4f05-4ff7-a430-1951347c170c" containerID="65b6c4168d3f26dca8bc5cc0d46760741ec7f74f1e5b6c4263f753a6d991e847" exitCode=0 Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.503742 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rctpl" event={"ID":"26b6c196-4f05-4ff7-a430-1951347c170c","Type":"ContainerDied","Data":"65b6c4168d3f26dca8bc5cc0d46760741ec7f74f1e5b6c4263f753a6d991e847"} Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.710723 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.783566 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgpmb\" (UniqueName: \"kubernetes.io/projected/26b6c196-4f05-4ff7-a430-1951347c170c-kube-api-access-bgpmb\") pod \"26b6c196-4f05-4ff7-a430-1951347c170c\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.783611 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-utilities\") pod \"26b6c196-4f05-4ff7-a430-1951347c170c\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.783670 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-catalog-content\") pod \"26b6c196-4f05-4ff7-a430-1951347c170c\" (UID: \"26b6c196-4f05-4ff7-a430-1951347c170c\") " Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.784906 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-utilities" (OuterVolumeSpecName: "utilities") pod "26b6c196-4f05-4ff7-a430-1951347c170c" (UID: "26b6c196-4f05-4ff7-a430-1951347c170c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.796257 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b6c196-4f05-4ff7-a430-1951347c170c-kube-api-access-bgpmb" (OuterVolumeSpecName: "kube-api-access-bgpmb") pod "26b6c196-4f05-4ff7-a430-1951347c170c" (UID: "26b6c196-4f05-4ff7-a430-1951347c170c"). InnerVolumeSpecName "kube-api-access-bgpmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.885948 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgpmb\" (UniqueName: \"kubernetes.io/projected/26b6c196-4f05-4ff7-a430-1951347c170c-kube-api-access-bgpmb\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:13 crc kubenswrapper[4892]: I1006 13:21:13.885989 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.223579 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26b6c196-4f05-4ff7-a430-1951347c170c" (UID: "26b6c196-4f05-4ff7-a430-1951347c170c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.294069 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b6c196-4f05-4ff7-a430-1951347c170c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.520400 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rctpl" event={"ID":"26b6c196-4f05-4ff7-a430-1951347c170c","Type":"ContainerDied","Data":"54c105e0657b90ee95ff05c50902a0019d359684836aa4bcc4807e5cef9e2db9"} Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.520465 4892 scope.go:117] "RemoveContainer" containerID="65b6c4168d3f26dca8bc5cc0d46760741ec7f74f1e5b6c4263f753a6d991e847" Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.520547 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rctpl" Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.561008 4892 scope.go:117] "RemoveContainer" containerID="47740143686e74fba18037e74ec6175149593a67f75ccdf92f351f450d7e5877" Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.578710 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rctpl"] Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.592422 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rctpl"] Oct 06 13:21:14 crc kubenswrapper[4892]: I1006 13:21:14.602889 4892 scope.go:117] "RemoveContainer" containerID="15515ea45745b638d19fac8a6db780a1ba5f2e033738c18191903a32f03ce9fc" Oct 06 13:21:16 crc kubenswrapper[4892]: I1006 13:21:16.186572 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" path="/var/lib/kubelet/pods/26b6c196-4f05-4ff7-a430-1951347c170c/volumes" Oct 06 13:21:22 crc kubenswrapper[4892]: I1006 13:21:22.984872 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:21:22 crc kubenswrapper[4892]: I1006 13:21:22.985809 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:21:22 crc kubenswrapper[4892]: I1006 13:21:22.985887 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:21:22 crc kubenswrapper[4892]: I1006 13:21:22.987432 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6a2358a9140b05b01d2246bb549c63ea90b67cdbb7a2c48a6afe206243e02e2"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:21:22 crc kubenswrapper[4892]: I1006 13:21:22.987549 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://f6a2358a9140b05b01d2246bb549c63ea90b67cdbb7a2c48a6afe206243e02e2" gracePeriod=600 Oct 06 13:21:23 crc kubenswrapper[4892]: I1006 13:21:23.637209 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="f6a2358a9140b05b01d2246bb549c63ea90b67cdbb7a2c48a6afe206243e02e2" exitCode=0 Oct 06 13:21:23 crc kubenswrapper[4892]: I1006 13:21:23.637854 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"f6a2358a9140b05b01d2246bb549c63ea90b67cdbb7a2c48a6afe206243e02e2"} Oct 06 13:21:23 crc kubenswrapper[4892]: I1006 13:21:23.637888 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982"} Oct 06 13:21:23 crc kubenswrapper[4892]: I1006 13:21:23.637909 4892 scope.go:117] "RemoveContainer" containerID="9b6006236fc6abe035c93714669ee39a175454e4722a6da959bdc4ddfea6c504" Oct 06 13:23:52 crc kubenswrapper[4892]: I1006 13:23:52.984427 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:23:52 crc kubenswrapper[4892]: I1006 13:23:52.985049 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:24:22 crc kubenswrapper[4892]: I1006 13:24:22.985110 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:24:22 crc kubenswrapper[4892]: I1006 13:24:22.986463 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:24:52 crc kubenswrapper[4892]: I1006 13:24:52.984277 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:24:52 crc kubenswrapper[4892]: I1006 13:24:52.984974 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:24:52 crc kubenswrapper[4892]: I1006 13:24:52.985104 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:24:52 crc kubenswrapper[4892]: I1006 13:24:52.986595 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:24:52 crc kubenswrapper[4892]: I1006 13:24:52.986707 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" gracePeriod=600 Oct 06 13:24:53 crc kubenswrapper[4892]: E1006 13:24:53.106081 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:24:53 crc kubenswrapper[4892]: I1006 13:24:53.166209 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" exitCode=0 Oct 06 13:24:53 crc kubenswrapper[4892]: I1006 13:24:53.166265 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982"} Oct 06 13:24:53 crc kubenswrapper[4892]: I1006 13:24:53.166311 4892 scope.go:117] "RemoveContainer" containerID="f6a2358a9140b05b01d2246bb549c63ea90b67cdbb7a2c48a6afe206243e02e2" Oct 06 13:24:53 crc kubenswrapper[4892]: I1006 13:24:53.166929 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:24:53 crc kubenswrapper[4892]: E1006 13:24:53.167301 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:25:07 crc kubenswrapper[4892]: I1006 13:25:07.168779 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:25:07 crc kubenswrapper[4892]: E1006 13:25:07.169822 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:25:22 crc kubenswrapper[4892]: I1006 13:25:22.172649 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:25:22 crc kubenswrapper[4892]: E1006 13:25:22.174039 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:25:36 crc kubenswrapper[4892]: I1006 13:25:36.169231 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:25:36 crc kubenswrapper[4892]: E1006 13:25:36.170215 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:25:51 crc kubenswrapper[4892]: I1006 13:25:51.169156 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:25:51 crc kubenswrapper[4892]: E1006 13:25:51.170025 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:26:06 crc kubenswrapper[4892]: I1006 13:26:06.169420 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:26:06 crc kubenswrapper[4892]: E1006 13:26:06.171535 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.169822 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:26:20 crc kubenswrapper[4892]: E1006 13:26:20.170625 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" 
podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.941227 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lf7kd"] Oct 06 13:26:20 crc kubenswrapper[4892]: E1006 13:26:20.942032 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="extract-utilities" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.942070 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="extract-utilities" Oct 06 13:26:20 crc kubenswrapper[4892]: E1006 13:26:20.942103 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="registry-server" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.942116 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="registry-server" Oct 06 13:26:20 crc kubenswrapper[4892]: E1006 13:26:20.942142 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="extract-content" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.942181 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="extract-content" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.942616 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b6c196-4f05-4ff7-a430-1951347c170c" containerName="registry-server" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.945524 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:20 crc kubenswrapper[4892]: I1006 13:26:20.972001 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lf7kd"] Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.031763 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29z2n\" (UniqueName: \"kubernetes.io/projected/79bdd20d-6bbe-48b0-8717-6efce68efa76-kube-api-access-29z2n\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.031814 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-catalog-content\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.032047 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-utilities\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.134794 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-utilities\") pod \"certified-operators-lf7kd\" (UID: 
\"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.135210 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29z2n\" (UniqueName: \"kubernetes.io/projected/79bdd20d-6bbe-48b0-8717-6efce68efa76-kube-api-access-29z2n\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.135307 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-catalog-content\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.135536 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-utilities\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.136049 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-catalog-content\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.472572 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29z2n\" (UniqueName: \"kubernetes.io/projected/79bdd20d-6bbe-48b0-8717-6efce68efa76-kube-api-access-29z2n\") pod \"certified-operators-lf7kd\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:21 crc kubenswrapper[4892]: I1006 13:26:21.581194 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:22 crc kubenswrapper[4892]: I1006 13:26:22.115446 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lf7kd"] Oct 06 13:26:22 crc kubenswrapper[4892]: I1006 13:26:22.230139 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7kd" event={"ID":"79bdd20d-6bbe-48b0-8717-6efce68efa76","Type":"ContainerStarted","Data":"07da300f59f40094f1c244ce758f053d94d767231202ac4a5086365ce8d98ff6"} Oct 06 13:26:23 crc kubenswrapper[4892]: I1006 13:26:23.244521 4892 generic.go:334] "Generic (PLEG): container finished" podID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerID="3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73" exitCode=0 Oct 06 13:26:23 crc kubenswrapper[4892]: I1006 13:26:23.244681 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7kd" event={"ID":"79bdd20d-6bbe-48b0-8717-6efce68efa76","Type":"ContainerDied","Data":"3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73"} Oct 06 13:26:23 crc kubenswrapper[4892]: I1006 13:26:23.247287 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:26:25 crc kubenswrapper[4892]: I1006 13:26:25.265026 4892 generic.go:334] "Generic (PLEG): container finished" podID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerID="bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403" exitCode=0 Oct 06 13:26:25 crc kubenswrapper[4892]: I1006 13:26:25.265143 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7kd" event={"ID":"79bdd20d-6bbe-48b0-8717-6efce68efa76","Type":"ContainerDied","Data":"bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403"} Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.279909 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7kd" event={"ID":"79bdd20d-6bbe-48b0-8717-6efce68efa76","Type":"ContainerStarted","Data":"f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d"} Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.305718 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lf7kd" podStartSLOduration=3.725472451 podStartE2EDuration="6.305685731s" podCreationTimestamp="2025-10-06 13:26:20 +0000 UTC" firstStartedPulling="2025-10-06 13:26:23.246894558 +0000 UTC m=+4669.796600363" lastFinishedPulling="2025-10-06 13:26:25.827107888 +0000 UTC m=+4672.376813643" observedRunningTime="2025-10-06 13:26:26.299841061 +0000 UTC m=+4672.849546826" watchObservedRunningTime="2025-10-06 13:26:26.305685731 +0000 UTC m=+4672.855391536" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.365409 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x2qnz"] Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.367538 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x2qnz"] Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.367616 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.447572 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzls8\" (UniqueName: \"kubernetes.io/projected/4539f498-2033-44f8-9072-0bc0432de16e-kube-api-access-fzls8\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.447709 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-catalog-content\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.448034 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-utilities\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.550387 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-utilities\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.550449 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzls8\" (UniqueName: \"kubernetes.io/projected/4539f498-2033-44f8-9072-0bc0432de16e-kube-api-access-fzls8\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.550537 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-catalog-content\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.550931 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-utilities\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.551025 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-catalog-content\") pod \"redhat-operators-x2qnz\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.579203 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzls8\" (UniqueName: \"kubernetes.io/projected/4539f498-2033-44f8-9072-0bc0432de16e-kube-api-access-fzls8\") pod \"redhat-operators-x2qnz\" (UID: 
\"4539f498-2033-44f8-9072-0bc0432de16e\") " pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:26 crc kubenswrapper[4892]: I1006 13:26:26.722457 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:27 crc kubenswrapper[4892]: I1006 13:26:27.218911 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x2qnz"] Oct 06 13:26:27 crc kubenswrapper[4892]: W1006 13:26:27.221124 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4539f498_2033_44f8_9072_0bc0432de16e.slice/crio-38a7c63261ff1689f295097373eda8e4499f06998e32510d98720a716670203d WatchSource:0}: Error finding container 38a7c63261ff1689f295097373eda8e4499f06998e32510d98720a716670203d: Status 404 returned error can't find the container with id 38a7c63261ff1689f295097373eda8e4499f06998e32510d98720a716670203d Oct 06 13:26:27 crc kubenswrapper[4892]: I1006 13:26:27.289048 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2qnz" event={"ID":"4539f498-2033-44f8-9072-0bc0432de16e","Type":"ContainerStarted","Data":"38a7c63261ff1689f295097373eda8e4499f06998e32510d98720a716670203d"} Oct 06 13:26:28 crc kubenswrapper[4892]: I1006 13:26:28.300239 4892 generic.go:334] "Generic (PLEG): container finished" podID="4539f498-2033-44f8-9072-0bc0432de16e" containerID="4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0" exitCode=0 Oct 06 13:26:28 crc kubenswrapper[4892]: I1006 13:26:28.300293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2qnz" event={"ID":"4539f498-2033-44f8-9072-0bc0432de16e","Type":"ContainerDied","Data":"4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0"} Oct 06 13:26:30 crc kubenswrapper[4892]: I1006 13:26:30.328704 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2qnz" event={"ID":"4539f498-2033-44f8-9072-0bc0432de16e","Type":"ContainerStarted","Data":"b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390"} Oct 06 13:26:31 crc kubenswrapper[4892]: I1006 13:26:31.340972 4892 generic.go:334] "Generic (PLEG): container finished" podID="4539f498-2033-44f8-9072-0bc0432de16e" containerID="b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390" exitCode=0 Oct 06 13:26:31 crc kubenswrapper[4892]: I1006 13:26:31.341049 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2qnz" event={"ID":"4539f498-2033-44f8-9072-0bc0432de16e","Type":"ContainerDied","Data":"b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390"} Oct 06 13:26:31 crc kubenswrapper[4892]: I1006 13:26:31.582458 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:31 crc kubenswrapper[4892]: I1006 13:26:31.582763 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:31 crc kubenswrapper[4892]: I1006 13:26:31.654135 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:32 crc kubenswrapper[4892]: I1006 13:26:32.353303 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2qnz" 
event={"ID":"4539f498-2033-44f8-9072-0bc0432de16e","Type":"ContainerStarted","Data":"07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8"} Oct 06 13:26:32 crc kubenswrapper[4892]: I1006 13:26:32.388682 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x2qnz" podStartSLOduration=2.823928134 podStartE2EDuration="6.38864519s" podCreationTimestamp="2025-10-06 13:26:26 +0000 UTC" firstStartedPulling="2025-10-06 13:26:28.302421677 +0000 UTC m=+4674.852127442" lastFinishedPulling="2025-10-06 13:26:31.867138713 +0000 UTC m=+4678.416844498" observedRunningTime="2025-10-06 13:26:32.382268925 +0000 UTC m=+4678.931974700" watchObservedRunningTime="2025-10-06 13:26:32.38864519 +0000 UTC m=+4678.938351055" Oct 06 13:26:32 crc kubenswrapper[4892]: I1006 13:26:32.405744 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:33 crc kubenswrapper[4892]: I1006 13:26:33.920825 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lf7kd"] Oct 06 13:26:34 crc kubenswrapper[4892]: I1006 13:26:34.176426 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:26:34 crc kubenswrapper[4892]: E1006 13:26:34.176742 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:26:35 crc kubenswrapper[4892]: I1006 13:26:35.385204 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lf7kd" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerName="registry-server" containerID="cri-o://f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d" gracePeriod=2 Oct 06 13:26:35 crc kubenswrapper[4892]: I1006 13:26:35.886907 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:35 crc kubenswrapper[4892]: I1006 13:26:35.968457 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-catalog-content\") pod \"79bdd20d-6bbe-48b0-8717-6efce68efa76\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " Oct 06 13:26:35 crc kubenswrapper[4892]: I1006 13:26:35.968599 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29z2n\" (UniqueName: \"kubernetes.io/projected/79bdd20d-6bbe-48b0-8717-6efce68efa76-kube-api-access-29z2n\") pod \"79bdd20d-6bbe-48b0-8717-6efce68efa76\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " Oct 06 13:26:35 crc kubenswrapper[4892]: I1006 13:26:35.968641 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-utilities\") pod \"79bdd20d-6bbe-48b0-8717-6efce68efa76\" (UID: \"79bdd20d-6bbe-48b0-8717-6efce68efa76\") " Oct 06 13:26:35 crc kubenswrapper[4892]: I1006 13:26:35.969536 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-utilities" (OuterVolumeSpecName: "utilities") pod "79bdd20d-6bbe-48b0-8717-6efce68efa76" (UID: "79bdd20d-6bbe-48b0-8717-6efce68efa76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:26:35 crc kubenswrapper[4892]: I1006 13:26:35.977716 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79bdd20d-6bbe-48b0-8717-6efce68efa76-kube-api-access-29z2n" (OuterVolumeSpecName: "kube-api-access-29z2n") pod "79bdd20d-6bbe-48b0-8717-6efce68efa76" (UID: "79bdd20d-6bbe-48b0-8717-6efce68efa76"). InnerVolumeSpecName "kube-api-access-29z2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.017808 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79bdd20d-6bbe-48b0-8717-6efce68efa76" (UID: "79bdd20d-6bbe-48b0-8717-6efce68efa76"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.071545 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29z2n\" (UniqueName: \"kubernetes.io/projected/79bdd20d-6bbe-48b0-8717-6efce68efa76-kube-api-access-29z2n\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.071590 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.071602 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79bdd20d-6bbe-48b0-8717-6efce68efa76-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.397611 4892 generic.go:334] "Generic (PLEG): container finished" podID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerID="f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d" exitCode=0 Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.397680 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7kd" event={"ID":"79bdd20d-6bbe-48b0-8717-6efce68efa76","Type":"ContainerDied","Data":"f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d"} Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.397723 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lf7kd" event={"ID":"79bdd20d-6bbe-48b0-8717-6efce68efa76","Type":"ContainerDied","Data":"07da300f59f40094f1c244ce758f053d94d767231202ac4a5086365ce8d98ff6"} Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.397758 4892 scope.go:117] "RemoveContainer" containerID="f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.397686 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lf7kd" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.430558 4892 scope.go:117] "RemoveContainer" containerID="bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.433365 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lf7kd"] Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.450202 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lf7kd"] Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.464835 4892 scope.go:117] "RemoveContainer" containerID="3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.536971 4892 scope.go:117] "RemoveContainer" containerID="f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d" Oct 06 13:26:36 crc kubenswrapper[4892]: E1006 13:26:36.537500 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d\": container with ID starting with f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d not found: ID does not exist" containerID="f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.537537 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d"} err="failed to get container status \"f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d\": rpc error: code = NotFound desc = could not find container \"f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d\": container with ID starting with f19b45eaa74f3cec6db277b970140931c65cd00072a6259708ef149fafeb256d not found: ID does not exist" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.537565 4892 scope.go:117] "RemoveContainer" containerID="bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403" Oct 06 13:26:36 crc kubenswrapper[4892]: E1006 13:26:36.537852 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403\": container with ID starting with bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403 not found: ID does not exist" containerID="bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.537869 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403"} err="failed to get container status \"bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403\": rpc error: code = NotFound desc = could not find container \"bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403\": container with ID starting with bf60b8a0566c4a67b2401c09f3cdc2afa0d24b2ae07883fd0e3c222d10788403 not found: ID does not exist" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.537884 4892 scope.go:117] "RemoveContainer" containerID="3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73" Oct 06 13:26:36 crc kubenswrapper[4892]: E1006 13:26:36.538135 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73\": container with ID starting with 3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73 not found: ID does not exist" containerID="3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.538154 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73"} err="failed to get container status \"3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73\": rpc error: code = NotFound desc = could not find container \"3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73\": container with ID starting with 3d217b647d849affec5b931a3eb86ec839faa019ed99018fa02e280c255fcd73 not found: ID does not exist" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.722682 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:36 crc kubenswrapper[4892]: I1006 13:26:36.722756 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:37 crc kubenswrapper[4892]: I1006 13:26:37.792316 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x2qnz" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="registry-server" probeResult="failure" output=< Oct 06 13:26:37 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Oct 06 13:26:37 crc kubenswrapper[4892]: > Oct 06 13:26:38 crc kubenswrapper[4892]: I1006 13:26:38.210896 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" path="/var/lib/kubelet/pods/79bdd20d-6bbe-48b0-8717-6efce68efa76/volumes" Oct 06 13:26:46 crc kubenswrapper[4892]: I1006 13:26:46.170262 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:26:46 crc kubenswrapper[4892]: E1006 13:26:46.172104 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:26:46 crc kubenswrapper[4892]: I1006 13:26:46.784787 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:46 crc kubenswrapper[4892]: I1006 13:26:46.870637 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:47 crc kubenswrapper[4892]: I1006 13:26:47.034539 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x2qnz"] Oct 06 13:26:48 crc kubenswrapper[4892]: I1006 13:26:48.531537 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x2qnz" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="registry-server" containerID="cri-o://07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8" 
gracePeriod=2 Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.063209 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.151130 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-catalog-content\") pod \"4539f498-2033-44f8-9072-0bc0432de16e\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.151323 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-utilities\") pod \"4539f498-2033-44f8-9072-0bc0432de16e\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.151413 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzls8\" (UniqueName: \"kubernetes.io/projected/4539f498-2033-44f8-9072-0bc0432de16e-kube-api-access-fzls8\") pod \"4539f498-2033-44f8-9072-0bc0432de16e\" (UID: \"4539f498-2033-44f8-9072-0bc0432de16e\") " Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.152303 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-utilities" (OuterVolumeSpecName: "utilities") pod "4539f498-2033-44f8-9072-0bc0432de16e" (UID: "4539f498-2033-44f8-9072-0bc0432de16e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.152713 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.157595 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4539f498-2033-44f8-9072-0bc0432de16e-kube-api-access-fzls8" (OuterVolumeSpecName: "kube-api-access-fzls8") pod "4539f498-2033-44f8-9072-0bc0432de16e" (UID: "4539f498-2033-44f8-9072-0bc0432de16e"). InnerVolumeSpecName "kube-api-access-fzls8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.249140 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4539f498-2033-44f8-9072-0bc0432de16e" (UID: "4539f498-2033-44f8-9072-0bc0432de16e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.254632 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4539f498-2033-44f8-9072-0bc0432de16e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.254662 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzls8\" (UniqueName: \"kubernetes.io/projected/4539f498-2033-44f8-9072-0bc0432de16e-kube-api-access-fzls8\") on node \"crc\" DevicePath \"\"" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.542252 4892 generic.go:334] "Generic (PLEG): container finished" podID="4539f498-2033-44f8-9072-0bc0432de16e" containerID="07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8" exitCode=0 Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.542313 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2qnz" event={"ID":"4539f498-2033-44f8-9072-0bc0432de16e","Type":"ContainerDied","Data":"07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8"} Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.542386 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x2qnz" event={"ID":"4539f498-2033-44f8-9072-0bc0432de16e","Type":"ContainerDied","Data":"38a7c63261ff1689f295097373eda8e4499f06998e32510d98720a716670203d"} Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.542392 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x2qnz" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.542417 4892 scope.go:117] "RemoveContainer" containerID="07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.568625 4892 scope.go:117] "RemoveContainer" containerID="b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.597666 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x2qnz"] Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.611260 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x2qnz"] Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.631026 4892 scope.go:117] "RemoveContainer" containerID="4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.677582 4892 scope.go:117] "RemoveContainer" containerID="07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8" Oct 06 13:26:49 crc kubenswrapper[4892]: E1006 13:26:49.678328 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8\": container with ID starting with 07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8 not found: ID does not exist" containerID="07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.678443 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8"} err="failed to get container status \"07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8\": 
rpc error: code = NotFound desc = could not find container \"07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8\": container with ID starting with 07d4d3884197931c7b033d228dbecc2da50b5628316c37abda0c335fa5106db8 not found: ID does not exist" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.678476 4892 scope.go:117] "RemoveContainer" containerID="b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390" Oct 06 13:26:49 crc kubenswrapper[4892]: E1006 13:26:49.678935 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390\": container with ID starting with b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390 not found: ID does not exist" containerID="b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.678973 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390"} err="failed to get container status \"b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390\": rpc error: code = NotFound desc = could not find container \"b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390\": container with ID starting with b6862690911ca7dffee2e3ee0f748b56bffe1c3d6a6c03b5666f0af327f63390 not found: ID does not exist" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.679000 4892 scope.go:117] "RemoveContainer" containerID="4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0" Oct 06 13:26:49 crc kubenswrapper[4892]: E1006 13:26:49.679505 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0\": container with ID starting with 4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0 not found: ID does not exist" containerID="4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0" Oct 06 13:26:49 crc kubenswrapper[4892]: I1006 13:26:49.679539 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0"} err="failed to get container status \"4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0\": rpc error: code = NotFound desc = could not find container \"4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0\": container with ID starting with 4901face8e3952720098c73c5ffe9d0dc0a61d2ccc5b239d532357e63f8381e0 not found: ID does not exist" Oct 06 13:26:50 crc kubenswrapper[4892]: I1006 13:26:50.187707 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4539f498-2033-44f8-9072-0bc0432de16e" path="/var/lib/kubelet/pods/4539f498-2033-44f8-9072-0bc0432de16e/volumes" Oct 06 13:26:58 crc kubenswrapper[4892]: I1006 13:26:58.170315 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:26:58 crc kubenswrapper[4892]: E1006 13:26:58.171299 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:27:12 crc kubenswrapper[4892]: I1006 13:27:12.171908 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:27:12 crc kubenswrapper[4892]: E1006 13:27:12.174424 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:27:26 crc kubenswrapper[4892]: I1006 13:27:26.169523 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:27:26 crc kubenswrapper[4892]: E1006 13:27:26.172581 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:27:40 crc kubenswrapper[4892]: I1006 13:27:40.169113 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:27:40 crc kubenswrapper[4892]: E1006 13:27:40.170152 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:27:52 crc kubenswrapper[4892]: I1006 13:27:52.169202 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:27:52 crc kubenswrapper[4892]: E1006 13:27:52.170523 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:28:04 crc kubenswrapper[4892]: I1006 13:28:04.182345 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:28:04 crc kubenswrapper[4892]: E1006 13:28:04.183410 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:28:17 crc kubenswrapper[4892]: I1006 13:28:17.168410 4892 
scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:28:17 crc kubenswrapper[4892]: E1006 13:28:17.169494 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.466303 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-96rx8"] Oct 06 13:28:19 crc kubenswrapper[4892]: E1006 13:28:19.467572 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="registry-server" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.467600 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="registry-server" Oct 06 13:28:19 crc kubenswrapper[4892]: E1006 13:28:19.467637 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerName="registry-server" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.467655 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerName="registry-server" Oct 06 13:28:19 crc kubenswrapper[4892]: E1006 13:28:19.467706 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerName="extract-utilities" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.467720 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerName="extract-utilities" Oct 06 13:28:19 crc kubenswrapper[4892]: E1006 13:28:19.467784 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerName="extract-content" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.467797 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" containerName="extract-content" Oct 06 13:28:19 crc kubenswrapper[4892]: E1006 13:28:19.467827 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="extract-content" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.467841 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="extract-content" Oct 06 13:28:19 crc kubenswrapper[4892]: E1006 13:28:19.467873 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="extract-utilities" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.467884 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="extract-utilities" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.468257 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4539f498-2033-44f8-9072-0bc0432de16e" containerName="registry-server" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.468311 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="79bdd20d-6bbe-48b0-8717-6efce68efa76" 
containerName="registry-server" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.471733 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.476175 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96rx8"] Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.648915 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-utilities\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.649080 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-catalog-content\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.649525 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr8xz\" (UniqueName: \"kubernetes.io/projected/69aa0667-1765-4397-a6df-b551dc95c973-kube-api-access-zr8xz\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.751104 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr8xz\" (UniqueName: \"kubernetes.io/projected/69aa0667-1765-4397-a6df-b551dc95c973-kube-api-access-zr8xz\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.751450 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-utilities\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.751571 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-catalog-content\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.752055 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-utilities\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.752109 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-catalog-content\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " 
pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.773468 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr8xz\" (UniqueName: \"kubernetes.io/projected/69aa0667-1765-4397-a6df-b551dc95c973-kube-api-access-zr8xz\") pod \"redhat-marketplace-96rx8\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:19 crc kubenswrapper[4892]: I1006 13:28:19.798260 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:20 crc kubenswrapper[4892]: I1006 13:28:20.326480 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-96rx8"] Oct 06 13:28:20 crc kubenswrapper[4892]: I1006 13:28:20.671222 4892 generic.go:334] "Generic (PLEG): container finished" podID="69aa0667-1765-4397-a6df-b551dc95c973" containerID="3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e" exitCode=0 Oct 06 13:28:20 crc kubenswrapper[4892]: I1006 13:28:20.671294 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96rx8" event={"ID":"69aa0667-1765-4397-a6df-b551dc95c973","Type":"ContainerDied","Data":"3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e"} Oct 06 13:28:20 crc kubenswrapper[4892]: I1006 13:28:20.671380 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96rx8" event={"ID":"69aa0667-1765-4397-a6df-b551dc95c973","Type":"ContainerStarted","Data":"d1f8e7d2bd065c31c9c161d187f93f819d5dd153ec5096a3bcf90f3c98df7b60"} Oct 06 13:28:22 crc kubenswrapper[4892]: I1006 13:28:22.705611 4892 generic.go:334] "Generic (PLEG): container finished" podID="69aa0667-1765-4397-a6df-b551dc95c973" containerID="00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27" exitCode=0 Oct 06 13:28:22 crc kubenswrapper[4892]: I1006 13:28:22.705735 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96rx8" event={"ID":"69aa0667-1765-4397-a6df-b551dc95c973","Type":"ContainerDied","Data":"00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27"} Oct 06 13:28:23 crc kubenswrapper[4892]: I1006 13:28:23.720690 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96rx8" event={"ID":"69aa0667-1765-4397-a6df-b551dc95c973","Type":"ContainerStarted","Data":"e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694"} Oct 06 13:28:23 crc kubenswrapper[4892]: I1006 13:28:23.760125 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-96rx8" podStartSLOduration=2.149882382 podStartE2EDuration="4.760094721s" podCreationTimestamp="2025-10-06 13:28:19 +0000 UTC" firstStartedPulling="2025-10-06 13:28:20.673743491 +0000 UTC m=+4787.223449306" lastFinishedPulling="2025-10-06 13:28:23.28395587 +0000 UTC m=+4789.833661645" observedRunningTime="2025-10-06 13:28:23.755172638 +0000 UTC m=+4790.304878403" watchObservedRunningTime="2025-10-06 13:28:23.760094721 +0000 UTC m=+4790.309800476" Oct 06 13:28:28 crc kubenswrapper[4892]: I1006 13:28:28.169706 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:28:28 crc kubenswrapper[4892]: E1006 13:28:28.170817 4892 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:28:29 crc kubenswrapper[4892]: I1006 13:28:29.798933 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:29 crc kubenswrapper[4892]: I1006 13:28:29.799473 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:29 crc kubenswrapper[4892]: I1006 13:28:29.882238 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:30 crc kubenswrapper[4892]: I1006 13:28:30.889998 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:30 crc kubenswrapper[4892]: I1006 13:28:30.956188 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96rx8"] Oct 06 13:28:32 crc kubenswrapper[4892]: I1006 13:28:32.861810 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-96rx8" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="registry-server" containerID="cri-o://e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694" gracePeriod=2 Oct 06 13:28:33 crc kubenswrapper[4892]: E1006 13:28:33.094928 4892 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.144:60862->38.102.83.144:40237: write tcp 38.102.83.144:60862->38.102.83.144:40237: write: broken pipe Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.340303 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.482636 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-utilities\") pod \"69aa0667-1765-4397-a6df-b551dc95c973\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.482887 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr8xz\" (UniqueName: \"kubernetes.io/projected/69aa0667-1765-4397-a6df-b551dc95c973-kube-api-access-zr8xz\") pod \"69aa0667-1765-4397-a6df-b551dc95c973\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.482945 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-catalog-content\") pod \"69aa0667-1765-4397-a6df-b551dc95c973\" (UID: \"69aa0667-1765-4397-a6df-b551dc95c973\") " Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.483898 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-utilities" (OuterVolumeSpecName: "utilities") pod "69aa0667-1765-4397-a6df-b551dc95c973" (UID: "69aa0667-1765-4397-a6df-b551dc95c973"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.488899 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69aa0667-1765-4397-a6df-b551dc95c973-kube-api-access-zr8xz" (OuterVolumeSpecName: "kube-api-access-zr8xz") pod "69aa0667-1765-4397-a6df-b551dc95c973" (UID: "69aa0667-1765-4397-a6df-b551dc95c973"). InnerVolumeSpecName "kube-api-access-zr8xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.496510 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69aa0667-1765-4397-a6df-b551dc95c973" (UID: "69aa0667-1765-4397-a6df-b551dc95c973"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.585028 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr8xz\" (UniqueName: \"kubernetes.io/projected/69aa0667-1765-4397-a6df-b551dc95c973-kube-api-access-zr8xz\") on node \"crc\" DevicePath \"\"" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.585061 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.585072 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69aa0667-1765-4397-a6df-b551dc95c973-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.875128 4892 generic.go:334] "Generic (PLEG): container finished" podID="69aa0667-1765-4397-a6df-b551dc95c973" containerID="e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694" exitCode=0 Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.875192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96rx8" event={"ID":"69aa0667-1765-4397-a6df-b551dc95c973","Type":"ContainerDied","Data":"e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694"} Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.875226 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-96rx8" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.875244 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-96rx8" event={"ID":"69aa0667-1765-4397-a6df-b551dc95c973","Type":"ContainerDied","Data":"d1f8e7d2bd065c31c9c161d187f93f819d5dd153ec5096a3bcf90f3c98df7b60"} Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.875280 4892 scope.go:117] "RemoveContainer" containerID="e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.907580 4892 scope.go:117] "RemoveContainer" containerID="00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.931275 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-96rx8"] Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.942743 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-96rx8"] Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.946392 4892 scope.go:117] "RemoveContainer" containerID="3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.996264 4892 scope.go:117] "RemoveContainer" containerID="e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694" Oct 06 13:28:33 crc kubenswrapper[4892]: E1006 13:28:33.997116 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694\": container with ID starting with e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694 not found: ID does not exist" containerID="e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.997163 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694"} err="failed to get container status \"e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694\": rpc error: code = NotFound desc = could not find container \"e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694\": container with ID starting with e63bbf2d21f6ea35c0bdde4e6742124d065ad691b4a8b5a377060f7fe271d694 not found: ID does not exist" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.997196 4892 scope.go:117] "RemoveContainer" containerID="00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27" Oct 06 13:28:33 crc kubenswrapper[4892]: E1006 13:28:33.997578 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27\": container with ID starting with 00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27 not found: ID does not exist" containerID="00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.997631 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27"} err="failed to get container status \"00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27\": rpc error: code = NotFound desc = could not find 
container \"00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27\": container with ID starting with 00d5ff635525e645bec21d29cce9068f592e9b90c5191d7150815ca82461ab27 not found: ID does not exist" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.997663 4892 scope.go:117] "RemoveContainer" containerID="3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e" Oct 06 13:28:33 crc kubenswrapper[4892]: E1006 13:28:33.998079 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e\": container with ID starting with 3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e not found: ID does not exist" containerID="3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e" Oct 06 13:28:33 crc kubenswrapper[4892]: I1006 13:28:33.998104 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e"} err="failed to get container status \"3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e\": rpc error: code = NotFound desc = could not find container \"3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e\": container with ID starting with 3bc66dee839aa54e16244063b2625b3454a17c12451ee1cb576c357710e0296e not found: ID does not exist" Oct 06 13:28:34 crc kubenswrapper[4892]: I1006 13:28:34.186759 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69aa0667-1765-4397-a6df-b551dc95c973" path="/var/lib/kubelet/pods/69aa0667-1765-4397-a6df-b551dc95c973/volumes" Oct 06 13:28:39 crc kubenswrapper[4892]: I1006 13:28:39.168627 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:28:39 crc kubenswrapper[4892]: E1006 13:28:39.169413 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:28:51 crc kubenswrapper[4892]: I1006 13:28:51.168506 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:28:51 crc kubenswrapper[4892]: E1006 13:28:51.169670 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:29:02 crc kubenswrapper[4892]: I1006 13:29:02.169661 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:29:02 crc kubenswrapper[4892]: E1006 13:29:02.170955 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:29:14 crc kubenswrapper[4892]: I1006 13:29:14.181515 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:29:14 crc kubenswrapper[4892]: E1006 13:29:14.182276 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:29:26 crc kubenswrapper[4892]: I1006 13:29:26.175710 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:29:26 crc kubenswrapper[4892]: E1006 13:29:26.177819 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:29:39 crc kubenswrapper[4892]: I1006 13:29:39.170797 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:29:39 crc kubenswrapper[4892]: E1006 13:29:39.171926 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:29:53 crc kubenswrapper[4892]: I1006 13:29:53.168851 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:29:53 crc kubenswrapper[4892]: I1006 13:29:53.855830 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"220e2ff1d4e6a2bbe3295946c774510d2c996f6c1166889e882de6ab46c2213f"} Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.155411 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t"] Oct 06 13:30:00 crc kubenswrapper[4892]: E1006 13:30:00.156605 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="extract-content" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.156626 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="extract-content" Oct 06 13:30:00 crc kubenswrapper[4892]: E1006 13:30:00.156655 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="registry-server" Oct 06 
13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.156666 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="registry-server" Oct 06 13:30:00 crc kubenswrapper[4892]: E1006 13:30:00.156727 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="extract-utilities" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.156740 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="extract-utilities" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.157059 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="69aa0667-1765-4397-a6df-b551dc95c973" containerName="registry-server" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.158223 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.160741 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.161190 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.205828 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t"] Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.215108 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pk5p\" (UniqueName: \"kubernetes.io/projected/39438910-e0d8-4209-8980-377efa5d9e44-kube-api-access-9pk5p\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.215361 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39438910-e0d8-4209-8980-377efa5d9e44-secret-volume\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.215444 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39438910-e0d8-4209-8980-377efa5d9e44-config-volume\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.317568 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pk5p\" (UniqueName: \"kubernetes.io/projected/39438910-e0d8-4209-8980-377efa5d9e44-kube-api-access-9pk5p\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.317633 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/39438910-e0d8-4209-8980-377efa5d9e44-secret-volume\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.317670 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39438910-e0d8-4209-8980-377efa5d9e44-config-volume\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.318778 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39438910-e0d8-4209-8980-377efa5d9e44-config-volume\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.324674 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39438910-e0d8-4209-8980-377efa5d9e44-secret-volume\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.334083 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pk5p\" (UniqueName: \"kubernetes.io/projected/39438910-e0d8-4209-8980-377efa5d9e44-kube-api-access-9pk5p\") pod \"collect-profiles-29329290-7dv9t\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.493635 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:00 crc kubenswrapper[4892]: I1006 13:30:00.989980 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t"] Oct 06 13:30:00 crc kubenswrapper[4892]: W1006 13:30:00.991845 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39438910_e0d8_4209_8980_377efa5d9e44.slice/crio-e8cb18b8157a8c649b1dbe3ef987d2973f05588ca9123a3789a3e2579da579b3 WatchSource:0}: Error finding container e8cb18b8157a8c649b1dbe3ef987d2973f05588ca9123a3789a3e2579da579b3: Status 404 returned error can't find the container with id e8cb18b8157a8c649b1dbe3ef987d2973f05588ca9123a3789a3e2579da579b3 Oct 06 13:30:01 crc kubenswrapper[4892]: I1006 13:30:01.940762 4892 generic.go:334] "Generic (PLEG): container finished" podID="39438910-e0d8-4209-8980-377efa5d9e44" containerID="51999975375dcdfbec719606fa70fde0236634b3b21246423a85116c2b20d0ed" exitCode=0 Oct 06 13:30:01 crc kubenswrapper[4892]: I1006 13:30:01.940800 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" event={"ID":"39438910-e0d8-4209-8980-377efa5d9e44","Type":"ContainerDied","Data":"51999975375dcdfbec719606fa70fde0236634b3b21246423a85116c2b20d0ed"} Oct 06 13:30:01 crc kubenswrapper[4892]: I1006 13:30:01.941162 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" event={"ID":"39438910-e0d8-4209-8980-377efa5d9e44","Type":"ContainerStarted","Data":"e8cb18b8157a8c649b1dbe3ef987d2973f05588ca9123a3789a3e2579da579b3"} Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.311494 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.380428 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pk5p\" (UniqueName: \"kubernetes.io/projected/39438910-e0d8-4209-8980-377efa5d9e44-kube-api-access-9pk5p\") pod \"39438910-e0d8-4209-8980-377efa5d9e44\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.380822 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39438910-e0d8-4209-8980-377efa5d9e44-secret-volume\") pod \"39438910-e0d8-4209-8980-377efa5d9e44\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.380951 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39438910-e0d8-4209-8980-377efa5d9e44-config-volume\") pod \"39438910-e0d8-4209-8980-377efa5d9e44\" (UID: \"39438910-e0d8-4209-8980-377efa5d9e44\") " Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.381488 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39438910-e0d8-4209-8980-377efa5d9e44-config-volume" (OuterVolumeSpecName: "config-volume") pod "39438910-e0d8-4209-8980-377efa5d9e44" (UID: "39438910-e0d8-4209-8980-377efa5d9e44"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.386025 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39438910-e0d8-4209-8980-377efa5d9e44-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39438910-e0d8-4209-8980-377efa5d9e44" (UID: "39438910-e0d8-4209-8980-377efa5d9e44"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.386510 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39438910-e0d8-4209-8980-377efa5d9e44-kube-api-access-9pk5p" (OuterVolumeSpecName: "kube-api-access-9pk5p") pod "39438910-e0d8-4209-8980-377efa5d9e44" (UID: "39438910-e0d8-4209-8980-377efa5d9e44"). InnerVolumeSpecName "kube-api-access-9pk5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.483852 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pk5p\" (UniqueName: \"kubernetes.io/projected/39438910-e0d8-4209-8980-377efa5d9e44-kube-api-access-9pk5p\") on node \"crc\" DevicePath \"\"" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.484189 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39438910-e0d8-4209-8980-377efa5d9e44-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.484318 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39438910-e0d8-4209-8980-377efa5d9e44-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.995327 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" event={"ID":"39438910-e0d8-4209-8980-377efa5d9e44","Type":"ContainerDied","Data":"e8cb18b8157a8c649b1dbe3ef987d2973f05588ca9123a3789a3e2579da579b3"} Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.995784 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8cb18b8157a8c649b1dbe3ef987d2973f05588ca9123a3789a3e2579da579b3" Oct 06 13:30:03 crc kubenswrapper[4892]: I1006 13:30:03.995378 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329290-7dv9t" Oct 06 13:30:04 crc kubenswrapper[4892]: I1006 13:30:04.389097 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm"] Oct 06 13:30:04 crc kubenswrapper[4892]: I1006 13:30:04.398826 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329245-b58zm"] Oct 06 13:30:06 crc kubenswrapper[4892]: I1006 13:30:06.187775 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6d00e2-2088-4bd1-8269-2337f0421848" path="/var/lib/kubelet/pods/7e6d00e2-2088-4bd1-8269-2337f0421848/volumes" Oct 06 13:31:03 crc kubenswrapper[4892]: I1006 13:31:03.887917 4892 scope.go:117] "RemoveContainer" containerID="40004991028aa620cdd7a290a3f55479f2ee85945621d94e3471ed91dfa56c30" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.235891 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j7xq2"] Oct 06 13:31:41 crc kubenswrapper[4892]: E1006 13:31:41.241922 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39438910-e0d8-4209-8980-377efa5d9e44" containerName="collect-profiles" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.241958 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="39438910-e0d8-4209-8980-377efa5d9e44" containerName="collect-profiles" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.243262 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="39438910-e0d8-4209-8980-377efa5d9e44" containerName="collect-profiles" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.248976 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.265138 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7xq2"] Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.367672 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kghxh\" (UniqueName: \"kubernetes.io/projected/6d075933-7772-497a-90fb-a74c102f35b9-kube-api-access-kghxh\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.367733 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-catalog-content\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.367867 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-utilities\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.469570 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-utilities\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.469710 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kghxh\" (UniqueName: \"kubernetes.io/projected/6d075933-7772-497a-90fb-a74c102f35b9-kube-api-access-kghxh\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.469735 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-catalog-content\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.470497 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-utilities\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.470575 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-catalog-content\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.491474 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kghxh\" (UniqueName: \"kubernetes.io/projected/6d075933-7772-497a-90fb-a74c102f35b9-kube-api-access-kghxh\") pod \"community-operators-j7xq2\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:41 crc kubenswrapper[4892]: I1006 13:31:41.603145 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:42 crc kubenswrapper[4892]: I1006 13:31:42.109814 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j7xq2"] Oct 06 13:31:42 crc kubenswrapper[4892]: I1006 13:31:42.282887 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7xq2" event={"ID":"6d075933-7772-497a-90fb-a74c102f35b9","Type":"ContainerStarted","Data":"a68d1863b14fb62ea36d40a674fc30aef16456cda37b21792d14d15a7b2db180"} Oct 06 13:31:43 crc kubenswrapper[4892]: I1006 13:31:43.297886 4892 generic.go:334] "Generic (PLEG): container finished" podID="6d075933-7772-497a-90fb-a74c102f35b9" containerID="b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c" exitCode=0 Oct 06 13:31:43 crc kubenswrapper[4892]: I1006 13:31:43.297940 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7xq2" event={"ID":"6d075933-7772-497a-90fb-a74c102f35b9","Type":"ContainerDied","Data":"b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c"} Oct 06 13:31:43 crc kubenswrapper[4892]: I1006 13:31:43.300409 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:31:45 crc kubenswrapper[4892]: I1006 13:31:45.323956 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7xq2" event={"ID":"6d075933-7772-497a-90fb-a74c102f35b9","Type":"ContainerStarted","Data":"1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775"} Oct 06 13:31:46 crc kubenswrapper[4892]: I1006 13:31:46.348047 4892 generic.go:334] "Generic (PLEG): container finished" podID="6d075933-7772-497a-90fb-a74c102f35b9" containerID="1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775" exitCode=0 Oct 06 13:31:46 crc kubenswrapper[4892]: I1006 13:31:46.348474 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7xq2" event={"ID":"6d075933-7772-497a-90fb-a74c102f35b9","Type":"ContainerDied","Data":"1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775"} Oct 06 13:31:47 crc kubenswrapper[4892]: I1006 13:31:47.358703 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7xq2" event={"ID":"6d075933-7772-497a-90fb-a74c102f35b9","Type":"ContainerStarted","Data":"f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88"} Oct 06 13:31:47 crc kubenswrapper[4892]: I1006 13:31:47.381669 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j7xq2" podStartSLOduration=2.796626093 podStartE2EDuration="6.381649958s" podCreationTimestamp="2025-10-06 13:31:41 +0000 UTC" firstStartedPulling="2025-10-06 13:31:43.300122273 +0000 UTC m=+4989.849828048" lastFinishedPulling="2025-10-06 13:31:46.885146138 +0000 UTC m=+4993.434851913" observedRunningTime="2025-10-06 13:31:47.376546271 +0000 UTC m=+4993.926252036" watchObservedRunningTime="2025-10-06 
13:31:47.381649958 +0000 UTC m=+4993.931355723" Oct 06 13:31:51 crc kubenswrapper[4892]: I1006 13:31:51.603958 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:51 crc kubenswrapper[4892]: I1006 13:31:51.604454 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:51 crc kubenswrapper[4892]: I1006 13:31:51.676004 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:52 crc kubenswrapper[4892]: I1006 13:31:52.498777 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:52 crc kubenswrapper[4892]: I1006 13:31:52.565138 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7xq2"] Oct 06 13:31:54 crc kubenswrapper[4892]: I1006 13:31:54.463589 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j7xq2" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="registry-server" containerID="cri-o://f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88" gracePeriod=2 Oct 06 13:31:54 crc kubenswrapper[4892]: I1006 13:31:54.954975 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.064960 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kghxh\" (UniqueName: \"kubernetes.io/projected/6d075933-7772-497a-90fb-a74c102f35b9-kube-api-access-kghxh\") pod \"6d075933-7772-497a-90fb-a74c102f35b9\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.065314 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-catalog-content\") pod \"6d075933-7772-497a-90fb-a74c102f35b9\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.065390 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-utilities\") pod \"6d075933-7772-497a-90fb-a74c102f35b9\" (UID: \"6d075933-7772-497a-90fb-a74c102f35b9\") " Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.067067 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-utilities" (OuterVolumeSpecName: "utilities") pod "6d075933-7772-497a-90fb-a74c102f35b9" (UID: "6d075933-7772-497a-90fb-a74c102f35b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.072647 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d075933-7772-497a-90fb-a74c102f35b9-kube-api-access-kghxh" (OuterVolumeSpecName: "kube-api-access-kghxh") pod "6d075933-7772-497a-90fb-a74c102f35b9" (UID: "6d075933-7772-497a-90fb-a74c102f35b9"). InnerVolumeSpecName "kube-api-access-kghxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.137557 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d075933-7772-497a-90fb-a74c102f35b9" (UID: "6d075933-7772-497a-90fb-a74c102f35b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.169036 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.169476 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d075933-7772-497a-90fb-a74c102f35b9-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.169796 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kghxh\" (UniqueName: \"kubernetes.io/projected/6d075933-7772-497a-90fb-a74c102f35b9-kube-api-access-kghxh\") on node \"crc\" DevicePath \"\"" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.481489 4892 generic.go:334] "Generic (PLEG): container finished" podID="6d075933-7772-497a-90fb-a74c102f35b9" containerID="f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88" exitCode=0 Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.481550 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7xq2" event={"ID":"6d075933-7772-497a-90fb-a74c102f35b9","Type":"ContainerDied","Data":"f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88"} Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.481594 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j7xq2" event={"ID":"6d075933-7772-497a-90fb-a74c102f35b9","Type":"ContainerDied","Data":"a68d1863b14fb62ea36d40a674fc30aef16456cda37b21792d14d15a7b2db180"} Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.481625 4892 scope.go:117] "RemoveContainer" containerID="f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.481815 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j7xq2" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.523763 4892 scope.go:117] "RemoveContainer" containerID="1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.542519 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j7xq2"] Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.556735 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j7xq2"] Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.565272 4892 scope.go:117] "RemoveContainer" containerID="b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.624170 4892 scope.go:117] "RemoveContainer" containerID="f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88" Oct 06 13:31:55 crc kubenswrapper[4892]: E1006 13:31:55.624762 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88\": container with ID starting with f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88 not found: ID does not exist" containerID="f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.624826 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88"} err="failed to get container status \"f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88\": rpc error: code = NotFound desc = could not find container \"f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88\": container with ID starting with f035b4532a41c744356107ac1c18be187692475766cb0198a7f51d1daa6b6b88 not found: ID does not exist" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.624860 4892 scope.go:117] "RemoveContainer" containerID="1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775" Oct 06 13:31:55 crc kubenswrapper[4892]: E1006 13:31:55.625602 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775\": container with ID starting with 1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775 not found: ID does not exist" containerID="1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.625641 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775"} err="failed to get container status \"1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775\": rpc error: code = NotFound desc = could not find container \"1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775\": container with ID starting with 1344b53a4ef10e3e932ec0a8dc8e9742ca60b89a38ba088182f7ce0391f8a775 not found: ID does not exist" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.625671 4892 scope.go:117] "RemoveContainer" containerID="b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c" Oct 06 13:31:55 crc kubenswrapper[4892]: E1006 13:31:55.625995 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c\": container with ID starting with b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c not found: ID does not exist" containerID="b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c" Oct 06 13:31:55 crc kubenswrapper[4892]: I1006 13:31:55.626024 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c"} err="failed to get container status \"b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c\": rpc error: code = NotFound desc = could not find container \"b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c\": container with ID starting with b3c6d58bd20419ea9d4ac02ac200f1dc1f1751f2f59a1887ce63f098ce23721c not found: ID does not exist" Oct 06 13:31:56 crc kubenswrapper[4892]: I1006 13:31:56.184696 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d075933-7772-497a-90fb-a74c102f35b9" path="/var/lib/kubelet/pods/6d075933-7772-497a-90fb-a74c102f35b9/volumes" Oct 06 13:32:22 crc kubenswrapper[4892]: I1006 13:32:22.984237 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:32:22 crc kubenswrapper[4892]: I1006 13:32:22.985000 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:32:52 crc kubenswrapper[4892]: I1006 13:32:52.984710 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:32:52 crc kubenswrapper[4892]: I1006 13:32:52.985519 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:33:22 crc kubenswrapper[4892]: I1006 13:33:22.984910 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:33:22 crc kubenswrapper[4892]: I1006 13:33:22.985441 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:33:22 crc kubenswrapper[4892]: I1006 13:33:22.985489 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:33:22 crc kubenswrapper[4892]: I1006 13:33:22.986350 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"220e2ff1d4e6a2bbe3295946c774510d2c996f6c1166889e882de6ab46c2213f"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:33:22 crc kubenswrapper[4892]: I1006 13:33:22.986420 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://220e2ff1d4e6a2bbe3295946c774510d2c996f6c1166889e882de6ab46c2213f" gracePeriod=600 Oct 06 13:33:23 crc kubenswrapper[4892]: I1006 13:33:23.534815 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="220e2ff1d4e6a2bbe3295946c774510d2c996f6c1166889e882de6ab46c2213f" exitCode=0 Oct 06 13:33:23 crc kubenswrapper[4892]: I1006 13:33:23.534875 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"220e2ff1d4e6a2bbe3295946c774510d2c996f6c1166889e882de6ab46c2213f"} Oct 06 13:33:23 crc kubenswrapper[4892]: I1006 13:33:23.535139 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415"} Oct 06 13:33:23 crc kubenswrapper[4892]: I1006 13:33:23.535165 4892 scope.go:117] "RemoveContainer" containerID="5c9a45fe5544f8ab487b1f298b1bcbb2c9cb4cdac1cf0f8cb9389c52c090a982" Oct 06 13:35:52 crc kubenswrapper[4892]: I1006 13:35:52.984262 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:35:52 crc kubenswrapper[4892]: I1006 13:35:52.984807 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:36:22 crc kubenswrapper[4892]: I1006 13:36:22.984200 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:36:22 crc kubenswrapper[4892]: I1006 13:36:22.984857 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:36:52 crc 
kubenswrapper[4892]: I1006 13:36:52.984588 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:36:52 crc kubenswrapper[4892]: I1006 13:36:52.985093 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:36:52 crc kubenswrapper[4892]: I1006 13:36:52.985151 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:36:52 crc kubenswrapper[4892]: I1006 13:36:52.986140 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:36:52 crc kubenswrapper[4892]: I1006 13:36:52.986208 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" gracePeriod=600 Oct 06 13:36:53 crc kubenswrapper[4892]: E1006 13:36:53.125428 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:36:54 crc kubenswrapper[4892]: I1006 13:36:54.010451 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" exitCode=0 Oct 06 13:36:54 crc kubenswrapper[4892]: I1006 13:36:54.010737 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415"} Oct 06 13:36:54 crc kubenswrapper[4892]: I1006 13:36:54.010775 4892 scope.go:117] "RemoveContainer" containerID="220e2ff1d4e6a2bbe3295946c774510d2c996f6c1166889e882de6ab46c2213f" Oct 06 13:36:54 crc kubenswrapper[4892]: I1006 13:36:54.011629 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:36:54 crc kubenswrapper[4892]: E1006 13:36:54.012047 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:37:05 crc kubenswrapper[4892]: I1006 13:37:05.169271 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:37:05 crc kubenswrapper[4892]: E1006 13:37:05.170558 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.130611 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zxjgp"] Oct 06 13:37:14 crc kubenswrapper[4892]: E1006 13:37:14.133281 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="extract-utilities" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.133458 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="extract-utilities" Oct 06 13:37:14 crc kubenswrapper[4892]: E1006 13:37:14.133582 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="extract-content" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.133704 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="extract-content" Oct 06 13:37:14 crc kubenswrapper[4892]: E1006 13:37:14.133824 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="registry-server" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.133938 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="registry-server" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.137745 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d075933-7772-497a-90fb-a74c102f35b9" containerName="registry-server" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.141887 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.149077 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxjgp"] Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.223984 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trdd\" (UniqueName: \"kubernetes.io/projected/446f4859-0c1e-44bc-a7d7-da678a3a14d5-kube-api-access-7trdd\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.224171 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-catalog-content\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.224237 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-utilities\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.325775 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-utilities\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.325870 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trdd\" (UniqueName: \"kubernetes.io/projected/446f4859-0c1e-44bc-a7d7-da678a3a14d5-kube-api-access-7trdd\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.326058 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-catalog-content\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.326357 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-utilities\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.326416 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-catalog-content\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.374146 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7trdd\" (UniqueName: \"kubernetes.io/projected/446f4859-0c1e-44bc-a7d7-da678a3a14d5-kube-api-access-7trdd\") pod \"redhat-operators-zxjgp\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.464072 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:14 crc kubenswrapper[4892]: I1006 13:37:14.964093 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zxjgp"] Oct 06 13:37:15 crc kubenswrapper[4892]: I1006 13:37:15.266273 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxjgp" event={"ID":"446f4859-0c1e-44bc-a7d7-da678a3a14d5","Type":"ContainerStarted","Data":"d8955d31776ed041dccb971863e0c0b7750107611f4010fb09ae654de9322fb4"} Oct 06 13:37:16 crc kubenswrapper[4892]: I1006 13:37:16.285729 4892 generic.go:334] "Generic (PLEG): container finished" podID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerID="bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e" exitCode=0 Oct 06 13:37:16 crc kubenswrapper[4892]: I1006 13:37:16.285781 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxjgp" event={"ID":"446f4859-0c1e-44bc-a7d7-da678a3a14d5","Type":"ContainerDied","Data":"bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e"} Oct 06 13:37:16 crc kubenswrapper[4892]: I1006 13:37:16.296782 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:37:18 crc kubenswrapper[4892]: I1006 13:37:18.175063 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:37:18 crc kubenswrapper[4892]: E1006 13:37:18.175951 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:37:18 crc kubenswrapper[4892]: I1006 13:37:18.308440 4892 generic.go:334] "Generic (PLEG): container finished" podID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerID="68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665" exitCode=0 Oct 06 13:37:18 crc kubenswrapper[4892]: I1006 13:37:18.308480 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxjgp" event={"ID":"446f4859-0c1e-44bc-a7d7-da678a3a14d5","Type":"ContainerDied","Data":"68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665"} Oct 06 13:37:19 crc kubenswrapper[4892]: I1006 13:37:19.320783 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxjgp" event={"ID":"446f4859-0c1e-44bc-a7d7-da678a3a14d5","Type":"ContainerStarted","Data":"cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5"} Oct 06 13:37:19 crc kubenswrapper[4892]: I1006 13:37:19.345298 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zxjgp" podStartSLOduration=2.89622466 podStartE2EDuration="5.345278773s" podCreationTimestamp="2025-10-06 13:37:14 +0000 UTC" 
firstStartedPulling="2025-10-06 13:37:16.296359703 +0000 UTC m=+5322.846065478" lastFinishedPulling="2025-10-06 13:37:18.745413786 +0000 UTC m=+5325.295119591" observedRunningTime="2025-10-06 13:37:19.343933624 +0000 UTC m=+5325.893639429" watchObservedRunningTime="2025-10-06 13:37:19.345278773 +0000 UTC m=+5325.894984538" Oct 06 13:37:24 crc kubenswrapper[4892]: I1006 13:37:24.465381 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:24 crc kubenswrapper[4892]: I1006 13:37:24.466265 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:24 crc kubenswrapper[4892]: I1006 13:37:24.551814 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:25 crc kubenswrapper[4892]: I1006 13:37:25.428497 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:25 crc kubenswrapper[4892]: I1006 13:37:25.489524 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxjgp"] Oct 06 13:37:27 crc kubenswrapper[4892]: I1006 13:37:27.418241 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zxjgp" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="registry-server" containerID="cri-o://cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5" gracePeriod=2 Oct 06 13:37:27 crc kubenswrapper[4892]: I1006 13:37:27.908151 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.036156 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-utilities\") pod \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.036562 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7trdd\" (UniqueName: \"kubernetes.io/projected/446f4859-0c1e-44bc-a7d7-da678a3a14d5-kube-api-access-7trdd\") pod \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.036620 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-catalog-content\") pod \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\" (UID: \"446f4859-0c1e-44bc-a7d7-da678a3a14d5\") " Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.038173 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-utilities" (OuterVolumeSpecName: "utilities") pod "446f4859-0c1e-44bc-a7d7-da678a3a14d5" (UID: "446f4859-0c1e-44bc-a7d7-da678a3a14d5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.042646 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446f4859-0c1e-44bc-a7d7-da678a3a14d5-kube-api-access-7trdd" (OuterVolumeSpecName: "kube-api-access-7trdd") pod "446f4859-0c1e-44bc-a7d7-da678a3a14d5" (UID: "446f4859-0c1e-44bc-a7d7-da678a3a14d5"). InnerVolumeSpecName "kube-api-access-7trdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.116966 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "446f4859-0c1e-44bc-a7d7-da678a3a14d5" (UID: "446f4859-0c1e-44bc-a7d7-da678a3a14d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.138919 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.138963 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f4859-0c1e-44bc-a7d7-da678a3a14d5-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.138978 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7trdd\" (UniqueName: \"kubernetes.io/projected/446f4859-0c1e-44bc-a7d7-da678a3a14d5-kube-api-access-7trdd\") on node \"crc\" DevicePath \"\"" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.430332 4892 generic.go:334] "Generic (PLEG): container finished" podID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerID="cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5" exitCode=0 Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.430372 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxjgp" event={"ID":"446f4859-0c1e-44bc-a7d7-da678a3a14d5","Type":"ContainerDied","Data":"cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5"} Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.430399 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zxjgp" event={"ID":"446f4859-0c1e-44bc-a7d7-da678a3a14d5","Type":"ContainerDied","Data":"d8955d31776ed041dccb971863e0c0b7750107611f4010fb09ae654de9322fb4"} Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.430415 4892 scope.go:117] "RemoveContainer" containerID="cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.430466 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zxjgp" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.458774 4892 scope.go:117] "RemoveContainer" containerID="68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.463979 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zxjgp"] Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.473672 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zxjgp"] Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.482851 4892 scope.go:117] "RemoveContainer" containerID="bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.527911 4892 scope.go:117] "RemoveContainer" containerID="cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5" Oct 06 13:37:28 crc kubenswrapper[4892]: E1006 13:37:28.528441 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5\": container with ID starting with cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5 not found: ID does not exist" containerID="cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.528562 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5"} err="failed to get container status \"cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5\": rpc error: code = NotFound desc = could not find container \"cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5\": container with ID starting with cb6bb3474b821cb73f34644bc6e5d2bba48f82c2eff44164dfa1ace22cf7bae5 not found: ID does not exist" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.528669 4892 scope.go:117] "RemoveContainer" containerID="68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665" Oct 06 13:37:28 crc kubenswrapper[4892]: E1006 13:37:28.529169 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665\": container with ID starting with 68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665 not found: ID does not exist" containerID="68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.529279 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665"} err="failed to get container status \"68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665\": rpc error: code = NotFound desc = could not find container \"68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665\": container with ID starting with 68fc6fed226c4cefa1de872cc1f1ec4eacd831825b38008209f9985ce6bcd665 not found: ID does not exist" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.529403 4892 scope.go:117] "RemoveContainer" containerID="bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e" Oct 06 13:37:28 crc kubenswrapper[4892]: E1006 13:37:28.529760 4892 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e\": container with ID starting with bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e not found: ID does not exist" containerID="bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e" Oct 06 13:37:28 crc kubenswrapper[4892]: I1006 13:37:28.529858 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e"} err="failed to get container status \"bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e\": rpc error: code = NotFound desc = could not find container \"bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e\": container with ID starting with bc7c62b6162ff84ca018d66258c4898ab092ef42771d2dcfdccac38d8d31d41e not found: ID does not exist" Oct 06 13:37:30 crc kubenswrapper[4892]: I1006 13:37:30.180363 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" path="/var/lib/kubelet/pods/446f4859-0c1e-44bc-a7d7-da678a3a14d5/volumes" Oct 06 13:37:33 crc kubenswrapper[4892]: I1006 13:37:33.169282 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:37:33 crc kubenswrapper[4892]: E1006 13:37:33.170474 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:37:44 crc kubenswrapper[4892]: I1006 13:37:44.182259 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:37:44 crc kubenswrapper[4892]: E1006 13:37:44.183360 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:37:56 crc kubenswrapper[4892]: I1006 13:37:56.168278 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:37:56 crc kubenswrapper[4892]: E1006 13:37:56.169106 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:38:07 crc kubenswrapper[4892]: I1006 13:38:07.168985 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:38:07 crc kubenswrapper[4892]: E1006 13:38:07.170216 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:38:21 crc kubenswrapper[4892]: I1006 13:38:21.168657 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:38:21 crc kubenswrapper[4892]: E1006 13:38:21.169601 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:38:32 crc kubenswrapper[4892]: I1006 13:38:32.169201 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:38:32 crc kubenswrapper[4892]: E1006 13:38:32.170174 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:38:46 crc kubenswrapper[4892]: I1006 13:38:46.169425 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:38:46 crc kubenswrapper[4892]: E1006 13:38:46.170656 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:38:59 crc kubenswrapper[4892]: I1006 13:38:59.170251 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:38:59 crc kubenswrapper[4892]: E1006 13:38:59.171403 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:39:13 crc kubenswrapper[4892]: I1006 13:39:13.169524 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:39:13 crc kubenswrapper[4892]: E1006 13:39:13.170703 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.572121 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qmxsx"] Oct 06 13:39:23 crc kubenswrapper[4892]: E1006 13:39:23.574826 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="extract-utilities" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.574856 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="extract-utilities" Oct 06 13:39:23 crc kubenswrapper[4892]: E1006 13:39:23.574884 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="extract-content" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.574898 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="extract-content" Oct 06 13:39:23 crc kubenswrapper[4892]: E1006 13:39:23.574933 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="registry-server" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.574947 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="registry-server" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.575349 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="446f4859-0c1e-44bc-a7d7-da678a3a14d5" containerName="registry-server" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.577816 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.591535 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-catalog-content\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.591856 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf86r\" (UniqueName: \"kubernetes.io/projected/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-kube-api-access-kf86r\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.592189 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-utilities\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.639775 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmxsx"] Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.694259 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-catalog-content\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.694305 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf86r\" (UniqueName: \"kubernetes.io/projected/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-kube-api-access-kf86r\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.694431 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-utilities\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.694846 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-catalog-content\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.694887 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-utilities\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.727365 4892 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kf86r\" (UniqueName: \"kubernetes.io/projected/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-kube-api-access-kf86r\") pod \"redhat-marketplace-qmxsx\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:23 crc kubenswrapper[4892]: I1006 13:39:23.913084 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:24 crc kubenswrapper[4892]: I1006 13:39:24.382178 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmxsx"] Oct 06 13:39:24 crc kubenswrapper[4892]: I1006 13:39:24.760143 4892 generic.go:334] "Generic (PLEG): container finished" podID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerID="9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d" exitCode=0 Oct 06 13:39:24 crc kubenswrapper[4892]: I1006 13:39:24.760248 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmxsx" event={"ID":"95a13839-a2b9-4ee0-b2e0-ebd343376bbd","Type":"ContainerDied","Data":"9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d"} Oct 06 13:39:24 crc kubenswrapper[4892]: I1006 13:39:24.760509 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmxsx" event={"ID":"95a13839-a2b9-4ee0-b2e0-ebd343376bbd","Type":"ContainerStarted","Data":"26e26ce6e0404b778664ffa050cb999adfa9954f6d3dcf819b4983d103667c8c"} Oct 06 13:39:26 crc kubenswrapper[4892]: I1006 13:39:26.800877 4892 generic.go:334] "Generic (PLEG): container finished" podID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerID="a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692" exitCode=0 Oct 06 13:39:26 crc kubenswrapper[4892]: I1006 13:39:26.801024 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmxsx" event={"ID":"95a13839-a2b9-4ee0-b2e0-ebd343376bbd","Type":"ContainerDied","Data":"a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692"} Oct 06 13:39:27 crc kubenswrapper[4892]: I1006 13:39:27.818665 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmxsx" event={"ID":"95a13839-a2b9-4ee0-b2e0-ebd343376bbd","Type":"ContainerStarted","Data":"2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4"} Oct 06 13:39:28 crc kubenswrapper[4892]: I1006 13:39:28.169991 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:39:28 crc kubenswrapper[4892]: E1006 13:39:28.170289 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:39:33 crc kubenswrapper[4892]: I1006 13:39:33.913898 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:33 crc kubenswrapper[4892]: I1006 13:39:33.914529 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:33 crc kubenswrapper[4892]: I1006 
13:39:33.985976 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:34 crc kubenswrapper[4892]: I1006 13:39:34.041549 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qmxsx" podStartSLOduration=8.530713496 podStartE2EDuration="11.041521444s" podCreationTimestamp="2025-10-06 13:39:23 +0000 UTC" firstStartedPulling="2025-10-06 13:39:24.763016564 +0000 UTC m=+5451.312722379" lastFinishedPulling="2025-10-06 13:39:27.273824552 +0000 UTC m=+5453.823530327" observedRunningTime="2025-10-06 13:39:27.847941195 +0000 UTC m=+5454.397646960" watchObservedRunningTime="2025-10-06 13:39:34.041521444 +0000 UTC m=+5460.591227249" Oct 06 13:39:34 crc kubenswrapper[4892]: I1006 13:39:34.987886 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:35 crc kubenswrapper[4892]: I1006 13:39:35.042922 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmxsx"] Oct 06 13:39:36 crc kubenswrapper[4892]: I1006 13:39:36.926033 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qmxsx" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="registry-server" containerID="cri-o://2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4" gracePeriod=2 Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.507928 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.614704 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-utilities\") pod \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.614840 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-catalog-content\") pod \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.615021 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf86r\" (UniqueName: \"kubernetes.io/projected/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-kube-api-access-kf86r\") pod \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\" (UID: \"95a13839-a2b9-4ee0-b2e0-ebd343376bbd\") " Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.615457 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-utilities" (OuterVolumeSpecName: "utilities") pod "95a13839-a2b9-4ee0-b2e0-ebd343376bbd" (UID: "95a13839-a2b9-4ee0-b2e0-ebd343376bbd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.615693 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.626614 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-kube-api-access-kf86r" (OuterVolumeSpecName: "kube-api-access-kf86r") pod "95a13839-a2b9-4ee0-b2e0-ebd343376bbd" (UID: "95a13839-a2b9-4ee0-b2e0-ebd343376bbd"). InnerVolumeSpecName "kube-api-access-kf86r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.626898 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95a13839-a2b9-4ee0-b2e0-ebd343376bbd" (UID: "95a13839-a2b9-4ee0-b2e0-ebd343376bbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.717674 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.718066 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf86r\" (UniqueName: \"kubernetes.io/projected/95a13839-a2b9-4ee0-b2e0-ebd343376bbd-kube-api-access-kf86r\") on node \"crc\" DevicePath \"\"" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.945815 4892 generic.go:334] "Generic (PLEG): container finished" podID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerID="2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4" exitCode=0 Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.945884 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmxsx" event={"ID":"95a13839-a2b9-4ee0-b2e0-ebd343376bbd","Type":"ContainerDied","Data":"2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4"} Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.945928 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qmxsx" event={"ID":"95a13839-a2b9-4ee0-b2e0-ebd343376bbd","Type":"ContainerDied","Data":"26e26ce6e0404b778664ffa050cb999adfa9954f6d3dcf819b4983d103667c8c"} Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.945958 4892 scope.go:117] "RemoveContainer" containerID="2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.945949 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qmxsx" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.985752 4892 scope.go:117] "RemoveContainer" containerID="a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692" Oct 06 13:39:37 crc kubenswrapper[4892]: I1006 13:39:37.999522 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmxsx"] Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.007643 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qmxsx"] Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.019987 4892 scope.go:117] "RemoveContainer" containerID="9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d" Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.081787 4892 scope.go:117] "RemoveContainer" containerID="2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4" Oct 06 13:39:38 crc kubenswrapper[4892]: E1006 13:39:38.082766 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4\": container with ID starting with 2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4 not found: ID does not exist" containerID="2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4" Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.082806 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4"} err="failed to get container status \"2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4\": rpc error: code = NotFound desc = could not find container \"2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4\": container with ID starting with 2b56b0c9d1e9f6dd443f18b52bc76f7219cac805e6c5b0adc5135489f51220e4 not found: ID does not exist" Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.082833 4892 scope.go:117] "RemoveContainer" containerID="a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692" Oct 06 13:39:38 crc kubenswrapper[4892]: E1006 13:39:38.083138 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692\": container with ID starting with a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692 not found: ID does not exist" containerID="a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692" Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.083174 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692"} err="failed to get container status \"a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692\": rpc error: code = NotFound desc = could not find container \"a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692\": container with ID starting with a10d8f6e476a7b78f0d4cfa97d31f171f11b99ffae0cfc403db4ad903435c692 not found: ID does not exist" Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.083209 4892 scope.go:117] "RemoveContainer" containerID="9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d" Oct 06 13:39:38 crc kubenswrapper[4892]: E1006 13:39:38.083565 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d\": container with ID starting with 9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d not found: ID does not exist" containerID="9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d" Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.083583 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d"} err="failed to get container status \"9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d\": rpc error: code = NotFound desc = could not find container \"9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d\": container with ID starting with 9e8e6fdee9de418dca86f18ac1ad58ddaea3946a0ebf0049f351d57851316b9d not found: ID does not exist" Oct 06 13:39:38 crc kubenswrapper[4892]: I1006 13:39:38.181685 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" path="/var/lib/kubelet/pods/95a13839-a2b9-4ee0-b2e0-ebd343376bbd/volumes" Oct 06 13:39:41 crc kubenswrapper[4892]: I1006 13:39:41.168673 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:39:41 crc kubenswrapper[4892]: E1006 13:39:41.169508 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:39:55 crc kubenswrapper[4892]: I1006 13:39:55.168425 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:39:55 crc kubenswrapper[4892]: E1006 13:39:55.169147 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:40:07 crc kubenswrapper[4892]: I1006 13:40:07.168984 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:40:07 crc kubenswrapper[4892]: E1006 13:40:07.170406 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:40:20 crc kubenswrapper[4892]: I1006 13:40:20.185054 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:40:20 crc kubenswrapper[4892]: E1006 13:40:20.191735 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:40:35 crc kubenswrapper[4892]: I1006 13:40:35.169848 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:40:35 crc kubenswrapper[4892]: E1006 13:40:35.172047 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:40:48 crc kubenswrapper[4892]: I1006 13:40:48.171458 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:40:48 crc kubenswrapper[4892]: E1006 13:40:48.172307 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:41:02 crc kubenswrapper[4892]: I1006 13:41:02.168405 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:41:02 crc kubenswrapper[4892]: E1006 13:41:02.169430 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:41:14 crc kubenswrapper[4892]: I1006 13:41:14.181448 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:41:14 crc kubenswrapper[4892]: E1006 13:41:14.182533 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:41:27 crc kubenswrapper[4892]: I1006 13:41:27.168964 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:41:27 crc kubenswrapper[4892]: E1006 13:41:27.170077 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 06 13:41:28 crc kubenswrapper[4892]: I1006 13:41:28.854747 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hn9v"]
Oct 06 13:41:28 crc kubenswrapper[4892]: E1006 13:41:28.855561 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="extract-content"
Oct 06 13:41:28 crc kubenswrapper[4892]: I1006 13:41:28.855579 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="extract-content"
Oct 06 13:41:28 crc kubenswrapper[4892]: E1006 13:41:28.855596 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="registry-server"
Oct 06 13:41:28 crc kubenswrapper[4892]: I1006 13:41:28.855606 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="registry-server"
Oct 06 13:41:28 crc kubenswrapper[4892]: E1006 13:41:28.855628 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="extract-utilities"
Oct 06 13:41:28 crc kubenswrapper[4892]: I1006 13:41:28.855637 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="extract-utilities"
Oct 06 13:41:28 crc kubenswrapper[4892]: I1006 13:41:28.855942 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="95a13839-a2b9-4ee0-b2e0-ebd343376bbd" containerName="registry-server"
Oct 06 13:41:28 crc kubenswrapper[4892]: I1006 13:41:28.858374 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:28 crc kubenswrapper[4892]: I1006 13:41:28.875709 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hn9v"]
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.010452 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-catalog-content\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.010520 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45mhj\" (UniqueName: \"kubernetes.io/projected/5bcf6841-2600-4c92-bed4-eb5aad47f66d-kube-api-access-45mhj\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.010554 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-utilities\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.112409 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-catalog-content\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.112593 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45mhj\" (UniqueName: \"kubernetes.io/projected/5bcf6841-2600-4c92-bed4-eb5aad47f66d-kube-api-access-45mhj\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.112901 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-catalog-content\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.113226 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-utilities\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.113667 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-utilities\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.132785 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45mhj\" (UniqueName: \"kubernetes.io/projected/5bcf6841-2600-4c92-bed4-eb5aad47f66d-kube-api-access-45mhj\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v"
"MountVolume.SetUp succeeded for volume \"kube-api-access-45mhj\" (UniqueName: \"kubernetes.io/projected/5bcf6841-2600-4c92-bed4-eb5aad47f66d-kube-api-access-45mhj\") pod \"certified-operators-2hn9v\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") " pod="openshift-marketplace/certified-operators-2hn9v" Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.192776 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hn9v" Oct 06 13:41:29 crc kubenswrapper[4892]: I1006 13:41:29.682496 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hn9v"] Oct 06 13:41:31 crc kubenswrapper[4892]: I1006 13:41:31.259648 4892 generic.go:334] "Generic (PLEG): container finished" podID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerID="444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb" exitCode=0 Oct 06 13:41:31 crc kubenswrapper[4892]: I1006 13:41:31.259819 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hn9v" event={"ID":"5bcf6841-2600-4c92-bed4-eb5aad47f66d","Type":"ContainerDied","Data":"444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb"} Oct 06 13:41:31 crc kubenswrapper[4892]: I1006 13:41:31.261102 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hn9v" event={"ID":"5bcf6841-2600-4c92-bed4-eb5aad47f66d","Type":"ContainerStarted","Data":"3b9586d2595444760b3442dccfd9babf8bc4bb80964c96e1d20a031caaed0a0c"} Oct 06 13:41:33 crc kubenswrapper[4892]: I1006 13:41:33.290690 4892 generic.go:334] "Generic (PLEG): container finished" podID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerID="f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1" exitCode=0 Oct 06 13:41:33 crc kubenswrapper[4892]: I1006 13:41:33.290791 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hn9v" event={"ID":"5bcf6841-2600-4c92-bed4-eb5aad47f66d","Type":"ContainerDied","Data":"f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1"} Oct 06 13:41:34 crc kubenswrapper[4892]: I1006 13:41:34.304573 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hn9v" event={"ID":"5bcf6841-2600-4c92-bed4-eb5aad47f66d","Type":"ContainerStarted","Data":"c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c"} Oct 06 13:41:34 crc kubenswrapper[4892]: I1006 13:41:34.344762 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hn9v" podStartSLOduration=3.8992952929999998 podStartE2EDuration="6.344738441s" podCreationTimestamp="2025-10-06 13:41:28 +0000 UTC" firstStartedPulling="2025-10-06 13:41:31.263551088 +0000 UTC m=+5577.813256853" lastFinishedPulling="2025-10-06 13:41:33.708994196 +0000 UTC m=+5580.258700001" observedRunningTime="2025-10-06 13:41:34.329361957 +0000 UTC m=+5580.879067732" watchObservedRunningTime="2025-10-06 13:41:34.344738441 +0000 UTC m=+5580.894444216" Oct 06 13:41:39 crc kubenswrapper[4892]: I1006 13:41:39.193233 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hn9v" Oct 06 13:41:39 crc kubenswrapper[4892]: I1006 13:41:39.193992 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hn9v" Oct 06 13:41:39 crc kubenswrapper[4892]: I1006 
Oct 06 13:41:39 crc kubenswrapper[4892]: I1006 13:41:39.422704 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:39 crc kubenswrapper[4892]: I1006 13:41:39.496075 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hn9v"]
Oct 06 13:41:41 crc kubenswrapper[4892]: I1006 13:41:41.395934 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hn9v" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="registry-server" containerID="cri-o://c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c" gracePeriod=2
Oct 06 13:41:41 crc kubenswrapper[4892]: I1006 13:41:41.898959 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hn9v"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.003772 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-utilities\") pod \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") "
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.003924 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-catalog-content\") pod \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") "
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.004009 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45mhj\" (UniqueName: \"kubernetes.io/projected/5bcf6841-2600-4c92-bed4-eb5aad47f66d-kube-api-access-45mhj\") pod \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\" (UID: \"5bcf6841-2600-4c92-bed4-eb5aad47f66d\") "
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.005275 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-utilities" (OuterVolumeSpecName: "utilities") pod "5bcf6841-2600-4c92-bed4-eb5aad47f66d" (UID: "5bcf6841-2600-4c92-bed4-eb5aad47f66d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.006822 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-utilities\") on node \"crc\" DevicePath \"\""
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.012800 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcf6841-2600-4c92-bed4-eb5aad47f66d-kube-api-access-45mhj" (OuterVolumeSpecName: "kube-api-access-45mhj") pod "5bcf6841-2600-4c92-bed4-eb5aad47f66d" (UID: "5bcf6841-2600-4c92-bed4-eb5aad47f66d"). InnerVolumeSpecName "kube-api-access-45mhj". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.054169 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5bcf6841-2600-4c92-bed4-eb5aad47f66d" (UID: "5bcf6841-2600-4c92-bed4-eb5aad47f66d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.108049 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5bcf6841-2600-4c92-bed4-eb5aad47f66d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.108079 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45mhj\" (UniqueName: \"kubernetes.io/projected/5bcf6841-2600-4c92-bed4-eb5aad47f66d-kube-api-access-45mhj\") on node \"crc\" DevicePath \"\"" Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.169696 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:41:42 crc kubenswrapper[4892]: E1006 13:41:42.170166 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.412406 4892 generic.go:334] "Generic (PLEG): container finished" podID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerID="c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c" exitCode=0 Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.412459 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.412482 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hn9v" event={"ID":"5bcf6841-2600-4c92-bed4-eb5aad47f66d","Type":"ContainerDied","Data":"c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c"}
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.413424 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hn9v" event={"ID":"5bcf6841-2600-4c92-bed4-eb5aad47f66d","Type":"ContainerDied","Data":"3b9586d2595444760b3442dccfd9babf8bc4bb80964c96e1d20a031caaed0a0c"}
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.413454 4892 scope.go:117] "RemoveContainer" containerID="c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.453936 4892 scope.go:117] "RemoveContainer" containerID="f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.464580 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hn9v"]
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.481519 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hn9v"]
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.482417 4892 scope.go:117] "RemoveContainer" containerID="444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.572549 4892 scope.go:117] "RemoveContainer" containerID="c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c"
Oct 06 13:41:42 crc kubenswrapper[4892]: E1006 13:41:42.573011 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c\": container with ID starting with c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c not found: ID does not exist" containerID="c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.573088 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c"} err="failed to get container status \"c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c\": rpc error: code = NotFound desc = could not find container \"c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c\": container with ID starting with c49180bf8e45f13463d96f295a66c3649c8d6c6d590885ff4e8abd86345a6b8c not found: ID does not exist"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.573118 4892 scope.go:117] "RemoveContainer" containerID="f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1"
Oct 06 13:41:42 crc kubenswrapper[4892]: E1006 13:41:42.573602 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1\": container with ID starting with f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1 not found: ID does not exist" containerID="f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.573625 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1"} err="failed to get container status \"f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1\": rpc error: code = NotFound desc = could not find container \"f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1\": container with ID starting with f3449dee1e53f9344f4808442bdc9716ac16c8184d011a9c7a5a0f8aaa4e9ef1 not found: ID does not exist"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.573644 4892 scope.go:117] "RemoveContainer" containerID="444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb"
Oct 06 13:41:42 crc kubenswrapper[4892]: E1006 13:41:42.573963 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb\": container with ID starting with 444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb not found: ID does not exist" containerID="444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb"
Oct 06 13:41:42 crc kubenswrapper[4892]: I1006 13:41:42.573987 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb"} err="failed to get container status \"444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb\": rpc error: code = NotFound desc = could not find container \"444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb\": container with ID starting with 444ea1882b0412e8c799dfedd556b2c186e59379b8714bca31b1f11a5761bbfb not found: ID does not exist"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.310885 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2k5lg"]
Oct 06 13:41:43 crc kubenswrapper[4892]: E1006 13:41:43.311501 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="extract-utilities"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.311513 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="extract-utilities"
Oct 06 13:41:43 crc kubenswrapper[4892]: E1006 13:41:43.311531 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="extract-content"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.311537 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="extract-content"
Oct 06 13:41:43 crc kubenswrapper[4892]: E1006 13:41:43.311575 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="registry-server"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.311581 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="registry-server"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.311749 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" containerName="registry-server"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.313152 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.328795 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-utilities\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.328985 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-catalog-content\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.329049 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k5zd\" (UniqueName: \"kubernetes.io/projected/bd07285d-7cfd-471c-b26f-3fb4aadeb716-kube-api-access-6k5zd\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.362291 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2k5lg"]
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.431428 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-catalog-content\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.431496 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k5zd\" (UniqueName: \"kubernetes.io/projected/bd07285d-7cfd-471c-b26f-3fb4aadeb716-kube-api-access-6k5zd\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.431615 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-utilities\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.432118 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-catalog-content\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.432120 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-utilities\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.452457 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k5zd\" (UniqueName: \"kubernetes.io/projected/bd07285d-7cfd-471c-b26f-3fb4aadeb716-kube-api-access-6k5zd\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg"
"MountVolume.SetUp succeeded for volume \"kube-api-access-6k5zd\" (UniqueName: \"kubernetes.io/projected/bd07285d-7cfd-471c-b26f-3fb4aadeb716-kube-api-access-6k5zd\") pod \"community-operators-2k5lg\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") " pod="openshift-marketplace/community-operators-2k5lg" Oct 06 13:41:43 crc kubenswrapper[4892]: I1006 13:41:43.650381 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2k5lg" Oct 06 13:41:44 crc kubenswrapper[4892]: I1006 13:41:44.181851 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcf6841-2600-4c92-bed4-eb5aad47f66d" path="/var/lib/kubelet/pods/5bcf6841-2600-4c92-bed4-eb5aad47f66d/volumes" Oct 06 13:41:44 crc kubenswrapper[4892]: I1006 13:41:44.273383 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2k5lg"] Oct 06 13:41:45 crc kubenswrapper[4892]: I1006 13:41:45.450012 4892 generic.go:334] "Generic (PLEG): container finished" podID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerID="aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9" exitCode=0 Oct 06 13:41:45 crc kubenswrapper[4892]: I1006 13:41:45.450496 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2k5lg" event={"ID":"bd07285d-7cfd-471c-b26f-3fb4aadeb716","Type":"ContainerDied","Data":"aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9"} Oct 06 13:41:45 crc kubenswrapper[4892]: I1006 13:41:45.450601 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2k5lg" event={"ID":"bd07285d-7cfd-471c-b26f-3fb4aadeb716","Type":"ContainerStarted","Data":"e51a3ec6fe6ab8dfc045469f6926e8792a26627d75565258e84cd7d8d3b0997e"} Oct 06 13:41:48 crc kubenswrapper[4892]: I1006 13:41:48.493807 4892 generic.go:334] "Generic (PLEG): container finished" podID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerID="220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687" exitCode=0 Oct 06 13:41:48 crc kubenswrapper[4892]: I1006 13:41:48.493861 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2k5lg" event={"ID":"bd07285d-7cfd-471c-b26f-3fb4aadeb716","Type":"ContainerDied","Data":"220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687"} Oct 06 13:41:49 crc kubenswrapper[4892]: I1006 13:41:49.506365 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2k5lg" event={"ID":"bd07285d-7cfd-471c-b26f-3fb4aadeb716","Type":"ContainerStarted","Data":"35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7"} Oct 06 13:41:53 crc kubenswrapper[4892]: I1006 13:41:53.652153 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2k5lg" Oct 06 13:41:53 crc kubenswrapper[4892]: I1006 13:41:53.652646 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2k5lg" Oct 06 13:41:53 crc kubenswrapper[4892]: I1006 13:41:53.737505 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2k5lg" Oct 06 13:41:53 crc kubenswrapper[4892]: I1006 13:41:53.768201 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2k5lg" podStartSLOduration=7.294565547 podStartE2EDuration="10.768173962s" 
Oct 06 13:41:54 crc kubenswrapper[4892]: I1006 13:41:54.645885 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:54 crc kubenswrapper[4892]: I1006 13:41:54.714833 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2k5lg"]
Oct 06 13:41:56 crc kubenswrapper[4892]: I1006 13:41:56.591867 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2k5lg" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerName="registry-server" containerID="cri-o://35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7" gracePeriod=2
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.132757 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2k5lg"
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.149048 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-catalog-content\") pod \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") "
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.149637 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-utilities\") pod \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") "
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.149726 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k5zd\" (UniqueName: \"kubernetes.io/projected/bd07285d-7cfd-471c-b26f-3fb4aadeb716-kube-api-access-6k5zd\") pod \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\" (UID: \"bd07285d-7cfd-471c-b26f-3fb4aadeb716\") "
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.152908 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-utilities" (OuterVolumeSpecName: "utilities") pod "bd07285d-7cfd-471c-b26f-3fb4aadeb716" (UID: "bd07285d-7cfd-471c-b26f-3fb4aadeb716"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.158609 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd07285d-7cfd-471c-b26f-3fb4aadeb716-kube-api-access-6k5zd" (OuterVolumeSpecName: "kube-api-access-6k5zd") pod "bd07285d-7cfd-471c-b26f-3fb4aadeb716" (UID: "bd07285d-7cfd-471c-b26f-3fb4aadeb716"). InnerVolumeSpecName "kube-api-access-6k5zd". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.177542 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.241717 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd07285d-7cfd-471c-b26f-3fb4aadeb716" (UID: "bd07285d-7cfd-471c-b26f-3fb4aadeb716"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.251739 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.251771 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k5zd\" (UniqueName: \"kubernetes.io/projected/bd07285d-7cfd-471c-b26f-3fb4aadeb716-kube-api-access-6k5zd\") on node \"crc\" DevicePath \"\"" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.251781 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd07285d-7cfd-471c-b26f-3fb4aadeb716-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.630817 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"8bcfca9978bab671bddad6fc7de728498ced079135c08b28d3faacf608ba6cf0"} Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.634076 4892 generic.go:334] "Generic (PLEG): container finished" podID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerID="35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7" exitCode=0 Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.634135 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2k5lg" event={"ID":"bd07285d-7cfd-471c-b26f-3fb4aadeb716","Type":"ContainerDied","Data":"35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7"} Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.634163 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2k5lg" event={"ID":"bd07285d-7cfd-471c-b26f-3fb4aadeb716","Type":"ContainerDied","Data":"e51a3ec6fe6ab8dfc045469f6926e8792a26627d75565258e84cd7d8d3b0997e"} Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.634200 4892 scope.go:117] "RemoveContainer" containerID="35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.634371 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.682541 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2k5lg"]
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.692999 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2k5lg"]
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.700799 4892 scope.go:117] "RemoveContainer" containerID="220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687"
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.725817 4892 scope.go:117] "RemoveContainer" containerID="aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9"
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.783418 4892 scope.go:117] "RemoveContainer" containerID="35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7"
Oct 06 13:41:57 crc kubenswrapper[4892]: E1006 13:41:57.784034 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7\": container with ID starting with 35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7 not found: ID does not exist" containerID="35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7"
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.784076 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7"} err="failed to get container status \"35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7\": rpc error: code = NotFound desc = could not find container \"35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7\": container with ID starting with 35826214afc58753813c91456c61cdad44a7c5801cabd6463668afca06f3b2a7 not found: ID does not exist"
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.784104 4892 scope.go:117] "RemoveContainer" containerID="220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687"
Oct 06 13:41:57 crc kubenswrapper[4892]: E1006 13:41:57.784341 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687\": container with ID starting with 220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687 not found: ID does not exist" containerID="220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687"
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.784367 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687"} err="failed to get container status \"220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687\": rpc error: code = NotFound desc = could not find container \"220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687\": container with ID starting with 220eda9271289a296a7a949dd4b4d3548dc68532b36d72e3bfd37bdeb0a41687 not found: ID does not exist"
Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.784386 4892 scope.go:117] "RemoveContainer" containerID="aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9"
Oct 06 13:41:57 crc kubenswrapper[4892]: E1006 13:41:57.784788 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9\": container with ID starting with aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9 not found: ID does not exist" containerID="aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9"
failed" err="rpc error: code = NotFound desc = could not find container \"aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9\": container with ID starting with aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9 not found: ID does not exist" containerID="aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9" Oct 06 13:41:57 crc kubenswrapper[4892]: I1006 13:41:57.784827 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9"} err="failed to get container status \"aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9\": rpc error: code = NotFound desc = could not find container \"aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9\": container with ID starting with aa9cf2e0d5797d6e80890d0a293bb32ff714fdd96bce2b075814dd3d107637a9 not found: ID does not exist" Oct 06 13:41:58 crc kubenswrapper[4892]: I1006 13:41:58.190617 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" path="/var/lib/kubelet/pods/bd07285d-7cfd-471c-b26f-3fb4aadeb716/volumes" Oct 06 13:44:22 crc kubenswrapper[4892]: I1006 13:44:22.984730 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:44:22 crc kubenswrapper[4892]: I1006 13:44:22.985466 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:44:52 crc kubenswrapper[4892]: I1006 13:44:52.984976 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:44:52 crc kubenswrapper[4892]: I1006 13:44:52.985895 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.187052 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"] Oct 06 13:45:00 crc kubenswrapper[4892]: E1006 13:45:00.188777 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerName="registry-server" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.188799 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerName="registry-server" Oct 06 13:45:00 crc kubenswrapper[4892]: E1006 13:45:00.188853 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerName="extract-content" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.188864 4892 
Oct 06 13:45:00 crc kubenswrapper[4892]: E1006 13:45:00.188888 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerName="extract-utilities"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.188904 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerName="extract-utilities"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.189776 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd07285d-7cfd-471c-b26f-3fb4aadeb716" containerName="registry-server"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.191502 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.203606 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.210230 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.212372 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"]
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.250089 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-secret-volume\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.250608 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxn7t\" (UniqueName: \"kubernetes.io/projected/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-kube-api-access-kxn7t\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.251064 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-config-volume\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.353415 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxn7t\" (UniqueName: \"kubernetes.io/projected/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-kube-api-access-kxn7t\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"
Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.353568 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-config-volume\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"
\"kubernetes.io/configmap/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-config-volume\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.353721 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-secret-volume\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.357275 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-config-volume\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.363567 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-secret-volume\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.371131 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxn7t\" (UniqueName: \"kubernetes.io/projected/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-kube-api-access-kxn7t\") pod \"collect-profiles-29329305-9b5fn\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" Oct 06 13:45:00 crc kubenswrapper[4892]: I1006 13:45:00.529036 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" Oct 06 13:45:01 crc kubenswrapper[4892]: I1006 13:45:01.005433 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn"] Oct 06 13:45:01 crc kubenswrapper[4892]: I1006 13:45:01.677227 4892 generic.go:334] "Generic (PLEG): container finished" podID="26990a17-eeaa-4516-a6c8-cf9d94b29b5b" containerID="9efdcb21e87d9327bad22e2859d0527c41d0d2a06f0e5df33a6741ba3576d59e" exitCode=0 Oct 06 13:45:01 crc kubenswrapper[4892]: I1006 13:45:01.677599 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" event={"ID":"26990a17-eeaa-4516-a6c8-cf9d94b29b5b","Type":"ContainerDied","Data":"9efdcb21e87d9327bad22e2859d0527c41d0d2a06f0e5df33a6741ba3576d59e"} Oct 06 13:45:01 crc kubenswrapper[4892]: I1006 13:45:01.677638 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" event={"ID":"26990a17-eeaa-4516-a6c8-cf9d94b29b5b","Type":"ContainerStarted","Data":"65bd28e1c6effbc9ed0f87ef52d1600e149296a2e406df0ae3cc10bd783088e9"} Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.126581 4892 util.go:48] "No ready sandbox for pod can be found. 
Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.218892 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-secret-volume\") pod \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") "
Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.218984 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxn7t\" (UniqueName: \"kubernetes.io/projected/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-kube-api-access-kxn7t\") pod \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") "
Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.219297 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-config-volume\") pod \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\" (UID: \"26990a17-eeaa-4516-a6c8-cf9d94b29b5b\") "
Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.219782 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-config-volume" (OuterVolumeSpecName: "config-volume") pod "26990a17-eeaa-4516-a6c8-cf9d94b29b5b" (UID: "26990a17-eeaa-4516-a6c8-cf9d94b29b5b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.220471 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-config-volume\") on node \"crc\" DevicePath \"\""
Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.232484 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-kube-api-access-kxn7t" (OuterVolumeSpecName: "kube-api-access-kxn7t") pod "26990a17-eeaa-4516-a6c8-cf9d94b29b5b" (UID: "26990a17-eeaa-4516-a6c8-cf9d94b29b5b"). InnerVolumeSpecName "kube-api-access-kxn7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.237066 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "26990a17-eeaa-4516-a6c8-cf9d94b29b5b" (UID: "26990a17-eeaa-4516-a6c8-cf9d94b29b5b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.321929 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.321984 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxn7t\" (UniqueName: \"kubernetes.io/projected/26990a17-eeaa-4516-a6c8-cf9d94b29b5b-kube-api-access-kxn7t\") on node \"crc\" DevicePath \"\"" Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.701829 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" event={"ID":"26990a17-eeaa-4516-a6c8-cf9d94b29b5b","Type":"ContainerDied","Data":"65bd28e1c6effbc9ed0f87ef52d1600e149296a2e406df0ae3cc10bd783088e9"} Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.701873 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65bd28e1c6effbc9ed0f87ef52d1600e149296a2e406df0ae3cc10bd783088e9" Oct 06 13:45:03 crc kubenswrapper[4892]: I1006 13:45:03.701930 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329305-9b5fn" Oct 06 13:45:04 crc kubenswrapper[4892]: I1006 13:45:04.215370 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"] Oct 06 13:45:04 crc kubenswrapper[4892]: I1006 13:45:04.232119 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329260-jpcfq"] Oct 06 13:45:06 crc kubenswrapper[4892]: I1006 13:45:06.181637 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="316d8af4-af87-4abf-b86a-3059a5f365ec" path="/var/lib/kubelet/pods/316d8af4-af87-4abf-b86a-3059a5f365ec/volumes" Oct 06 13:45:22 crc kubenswrapper[4892]: I1006 13:45:22.984267 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:45:22 crc kubenswrapper[4892]: I1006 13:45:22.984949 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:45:22 crc kubenswrapper[4892]: I1006 13:45:22.985010 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:45:22 crc kubenswrapper[4892]: I1006 13:45:22.986435 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bcfca9978bab671bddad6fc7de728498ced079135c08b28d3faacf608ba6cf0"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:45:22 crc kubenswrapper[4892]: I1006 13:45:22.986523 4892 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://8bcfca9978bab671bddad6fc7de728498ced079135c08b28d3faacf608ba6cf0" gracePeriod=600 Oct 06 13:45:23 crc kubenswrapper[4892]: I1006 13:45:23.925309 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="8bcfca9978bab671bddad6fc7de728498ced079135c08b28d3faacf608ba6cf0" exitCode=0 Oct 06 13:45:23 crc kubenswrapper[4892]: I1006 13:45:23.925974 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"8bcfca9978bab671bddad6fc7de728498ced079135c08b28d3faacf608ba6cf0"} Oct 06 13:45:23 crc kubenswrapper[4892]: I1006 13:45:23.926001 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e"} Oct 06 13:45:23 crc kubenswrapper[4892]: I1006 13:45:23.926018 4892 scope.go:117] "RemoveContainer" containerID="ff89740ac34a143b719bfe5482e68ec79dac1c56f9a08b80feb6d9cbef722415" Oct 06 13:46:04 crc kubenswrapper[4892]: I1006 13:46:04.305796 4892 scope.go:117] "RemoveContainer" containerID="6000c519e8ad01f46518a4edf9bda39dd0b7a08be6add40af33ecf5e11f5d034" Oct 06 13:47:52 crc kubenswrapper[4892]: I1006 13:47:52.984593 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:47:52 crc kubenswrapper[4892]: I1006 13:47:52.985357 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.411459 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5927r"] Oct 06 13:48:08 crc kubenswrapper[4892]: E1006 13:48:08.412454 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26990a17-eeaa-4516-a6c8-cf9d94b29b5b" containerName="collect-profiles" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.412470 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="26990a17-eeaa-4516-a6c8-cf9d94b29b5b" containerName="collect-profiles" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.412729 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="26990a17-eeaa-4516-a6c8-cf9d94b29b5b" containerName="collect-profiles" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.414534 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.433579 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5927r"] Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.459756 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-catalog-content\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.459847 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-utilities\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.460072 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j77kp\" (UniqueName: \"kubernetes.io/projected/f251d35c-7920-4f4f-89a5-28814d0e18fb-kube-api-access-j77kp\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.562292 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j77kp\" (UniqueName: \"kubernetes.io/projected/f251d35c-7920-4f4f-89a5-28814d0e18fb-kube-api-access-j77kp\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.562371 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-catalog-content\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.562403 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-utilities\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.563032 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-catalog-content\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.563044 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-utilities\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.582110 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j77kp\" (UniqueName: \"kubernetes.io/projected/f251d35c-7920-4f4f-89a5-28814d0e18fb-kube-api-access-j77kp\") pod \"redhat-operators-5927r\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:08 crc kubenswrapper[4892]: I1006 13:48:08.751989 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:09 crc kubenswrapper[4892]: I1006 13:48:09.204766 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5927r"] Oct 06 13:48:09 crc kubenswrapper[4892]: I1006 13:48:09.822051 4892 generic.go:334] "Generic (PLEG): container finished" podID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerID="a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3" exitCode=0 Oct 06 13:48:09 crc kubenswrapper[4892]: I1006 13:48:09.822092 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5927r" event={"ID":"f251d35c-7920-4f4f-89a5-28814d0e18fb","Type":"ContainerDied","Data":"a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3"} Oct 06 13:48:09 crc kubenswrapper[4892]: I1006 13:48:09.822116 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5927r" event={"ID":"f251d35c-7920-4f4f-89a5-28814d0e18fb","Type":"ContainerStarted","Data":"20601f42da4f61cb44269d3f0ce84125ae44a542eba1103c819ac6953e7f2f64"} Oct 06 13:48:09 crc kubenswrapper[4892]: I1006 13:48:09.826436 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:48:11 crc kubenswrapper[4892]: I1006 13:48:11.851298 4892 generic.go:334] "Generic (PLEG): container finished" podID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerID="360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7" exitCode=0 Oct 06 13:48:11 crc kubenswrapper[4892]: I1006 13:48:11.851418 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5927r" event={"ID":"f251d35c-7920-4f4f-89a5-28814d0e18fb","Type":"ContainerDied","Data":"360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7"} Oct 06 13:48:13 crc kubenswrapper[4892]: I1006 13:48:13.873679 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5927r" event={"ID":"f251d35c-7920-4f4f-89a5-28814d0e18fb","Type":"ContainerStarted","Data":"6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3"} Oct 06 13:48:13 crc kubenswrapper[4892]: I1006 13:48:13.896667 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5927r" podStartSLOduration=2.336333519 podStartE2EDuration="5.896651171s" podCreationTimestamp="2025-10-06 13:48:08 +0000 UTC" firstStartedPulling="2025-10-06 13:48:09.82603896 +0000 UTC m=+5976.375744725" lastFinishedPulling="2025-10-06 13:48:13.386356602 +0000 UTC m=+5979.936062377" observedRunningTime="2025-10-06 13:48:13.890331808 +0000 UTC m=+5980.440037573" watchObservedRunningTime="2025-10-06 13:48:13.896651171 +0000 UTC m=+5980.446356936" Oct 06 13:48:18 crc kubenswrapper[4892]: I1006 13:48:18.753128 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:18 crc kubenswrapper[4892]: I1006 13:48:18.753748 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:18 crc kubenswrapper[4892]: I1006 13:48:18.813680 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:18 crc kubenswrapper[4892]: I1006 13:48:18.975742 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:19 crc kubenswrapper[4892]: I1006 13:48:19.055168 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5927r"] Oct 06 13:48:20 crc kubenswrapper[4892]: I1006 13:48:20.942212 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5927r" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="registry-server" containerID="cri-o://6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3" gracePeriod=2 Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.600684 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.625900 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-catalog-content\") pod \"f251d35c-7920-4f4f-89a5-28814d0e18fb\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.626014 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-utilities\") pod \"f251d35c-7920-4f4f-89a5-28814d0e18fb\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.626080 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j77kp\" (UniqueName: \"kubernetes.io/projected/f251d35c-7920-4f4f-89a5-28814d0e18fb-kube-api-access-j77kp\") pod \"f251d35c-7920-4f4f-89a5-28814d0e18fb\" (UID: \"f251d35c-7920-4f4f-89a5-28814d0e18fb\") " Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.638249 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f251d35c-7920-4f4f-89a5-28814d0e18fb-kube-api-access-j77kp" (OuterVolumeSpecName: "kube-api-access-j77kp") pod "f251d35c-7920-4f4f-89a5-28814d0e18fb" (UID: "f251d35c-7920-4f4f-89a5-28814d0e18fb"). InnerVolumeSpecName "kube-api-access-j77kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.641519 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-utilities" (OuterVolumeSpecName: "utilities") pod "f251d35c-7920-4f4f-89a5-28814d0e18fb" (UID: "f251d35c-7920-4f4f-89a5-28814d0e18fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.725972 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f251d35c-7920-4f4f-89a5-28814d0e18fb" (UID: "f251d35c-7920-4f4f-89a5-28814d0e18fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.728813 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.728848 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f251d35c-7920-4f4f-89a5-28814d0e18fb-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.728860 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j77kp\" (UniqueName: \"kubernetes.io/projected/f251d35c-7920-4f4f-89a5-28814d0e18fb-kube-api-access-j77kp\") on node \"crc\" DevicePath \"\"" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.957883 4892 generic.go:334] "Generic (PLEG): container finished" podID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerID="6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3" exitCode=0 Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.957969 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5927r" event={"ID":"f251d35c-7920-4f4f-89a5-28814d0e18fb","Type":"ContainerDied","Data":"6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3"} Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.958059 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5927r" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.958086 4892 scope.go:117] "RemoveContainer" containerID="6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3" Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.958060 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5927r" event={"ID":"f251d35c-7920-4f4f-89a5-28814d0e18fb","Type":"ContainerDied","Data":"20601f42da4f61cb44269d3f0ce84125ae44a542eba1103c819ac6953e7f2f64"} Oct 06 13:48:21 crc kubenswrapper[4892]: I1006 13:48:21.989684 4892 scope.go:117] "RemoveContainer" containerID="360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.006178 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5927r"] Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.019490 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5927r"] Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.028120 4892 scope.go:117] "RemoveContainer" containerID="a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.067709 4892 scope.go:117] "RemoveContainer" containerID="6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3" Oct 06 13:48:22 crc kubenswrapper[4892]: E1006 13:48:22.068426 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3\": container with ID starting with 6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3 not found: ID does not exist" containerID="6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.068468 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3"} err="failed to get container status \"6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3\": rpc error: code = NotFound desc = could not find container \"6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3\": container with ID starting with 6ce1dc7938b2fdde085eda7918b42850797499e65dd28e067e714a1c52ee8ea3 not found: ID does not exist" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.068496 4892 scope.go:117] "RemoveContainer" containerID="360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7" Oct 06 13:48:22 crc kubenswrapper[4892]: E1006 13:48:22.068786 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7\": container with ID starting with 360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7 not found: ID does not exist" containerID="360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.068812 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7"} err="failed to get container status \"360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7\": rpc error: code = NotFound desc = could not find container \"360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7\": container with ID starting with 360879fa860aa4dd545760f9cc56ec6e887d440245322317907b0945221ac7e7 not found: ID does not exist" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.068826 4892 scope.go:117] "RemoveContainer" containerID="a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3" Oct 06 13:48:22 crc kubenswrapper[4892]: E1006 13:48:22.069198 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3\": container with ID starting with a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3 not found: ID does not exist" containerID="a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.069249 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3"} err="failed to get container status \"a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3\": rpc error: code = NotFound desc = could not find container \"a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3\": container with ID starting with a766c17324b843db273aa567087a165dd5b871c86cdfc45c55d774dbb98408a3 not found: ID does not exist" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.182939 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" path="/var/lib/kubelet/pods/f251d35c-7920-4f4f-89a5-28814d0e18fb/volumes" Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.984295 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:48:22 crc kubenswrapper[4892]: I1006 13:48:22.984613 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:48:52 crc kubenswrapper[4892]: I1006 13:48:52.987588 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:48:52 crc kubenswrapper[4892]: I1006 13:48:52.988017 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:48:52 crc kubenswrapper[4892]: I1006 13:48:52.988058 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:48:52 crc kubenswrapper[4892]: I1006 13:48:52.988731 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:48:52 crc kubenswrapper[4892]: I1006 13:48:52.988775 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" gracePeriod=600 Oct 06 13:48:53 crc kubenswrapper[4892]: E1006 13:48:53.190842 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:48:53 crc kubenswrapper[4892]: I1006 13:48:53.305264 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" exitCode=0 Oct 06 13:48:53 crc kubenswrapper[4892]: I1006 13:48:53.305318 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e"} Oct 06 13:48:53 crc kubenswrapper[4892]: I1006 13:48:53.305463 4892 scope.go:117] "RemoveContainer" containerID="8bcfca9978bab671bddad6fc7de728498ced079135c08b28d3faacf608ba6cf0" Oct 06 13:48:53 crc kubenswrapper[4892]: I1006 
13:48:53.306004 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:48:53 crc kubenswrapper[4892]: E1006 13:48:53.306260 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:49:04 crc kubenswrapper[4892]: I1006 13:49:04.176142 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:49:04 crc kubenswrapper[4892]: E1006 13:49:04.177121 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:49:16 crc kubenswrapper[4892]: I1006 13:49:16.168840 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:49:16 crc kubenswrapper[4892]: E1006 13:49:16.169556 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:49:29 crc kubenswrapper[4892]: I1006 13:49:29.168872 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:49:29 crc kubenswrapper[4892]: E1006 13:49:29.169633 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:49:41 crc kubenswrapper[4892]: I1006 13:49:41.170834 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:49:41 crc kubenswrapper[4892]: E1006 13:49:41.172619 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.893526 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k26ds"] Oct 06 13:49:47 crc kubenswrapper[4892]: E1006 13:49:47.894419 4892 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="extract-utilities" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.894434 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="extract-utilities" Oct 06 13:49:47 crc kubenswrapper[4892]: E1006 13:49:47.894445 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="extract-content" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.894451 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="extract-content" Oct 06 13:49:47 crc kubenswrapper[4892]: E1006 13:49:47.894466 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="registry-server" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.894472 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="registry-server" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.894689 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="f251d35c-7920-4f4f-89a5-28814d0e18fb" containerName="registry-server" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.896347 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.918408 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k26ds"] Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.974985 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-catalog-content\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.975165 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-utilities\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:47 crc kubenswrapper[4892]: I1006 13:49:47.975223 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2llnz\" (UniqueName: \"kubernetes.io/projected/3d28053a-e999-4f7d-887c-cb000b698544-kube-api-access-2llnz\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.077743 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-catalog-content\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.078015 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-utilities\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.078074 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2llnz\" (UniqueName: \"kubernetes.io/projected/3d28053a-e999-4f7d-887c-cb000b698544-kube-api-access-2llnz\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.078391 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-catalog-content\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.078491 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-utilities\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.110566 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2llnz\" (UniqueName: \"kubernetes.io/projected/3d28053a-e999-4f7d-887c-cb000b698544-kube-api-access-2llnz\") pod \"redhat-marketplace-k26ds\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.217683 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.688780 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k26ds"] Oct 06 13:49:48 crc kubenswrapper[4892]: I1006 13:49:48.832832 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k26ds" event={"ID":"3d28053a-e999-4f7d-887c-cb000b698544","Type":"ContainerStarted","Data":"13ec876d3e26926b09176cb77b10dcad741f22b1a535e5ecd1c0e198e652cad4"} Oct 06 13:49:49 crc kubenswrapper[4892]: I1006 13:49:49.845082 4892 generic.go:334] "Generic (PLEG): container finished" podID="3d28053a-e999-4f7d-887c-cb000b698544" containerID="5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399" exitCode=0 Oct 06 13:49:49 crc kubenswrapper[4892]: I1006 13:49:49.845146 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k26ds" event={"ID":"3d28053a-e999-4f7d-887c-cb000b698544","Type":"ContainerDied","Data":"5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399"} Oct 06 13:49:51 crc kubenswrapper[4892]: I1006 13:49:51.868064 4892 generic.go:334] "Generic (PLEG): container finished" podID="3d28053a-e999-4f7d-887c-cb000b698544" containerID="3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11" exitCode=0 Oct 06 13:49:51 crc kubenswrapper[4892]: I1006 13:49:51.868490 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k26ds" event={"ID":"3d28053a-e999-4f7d-887c-cb000b698544","Type":"ContainerDied","Data":"3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11"} Oct 06 13:49:52 crc kubenswrapper[4892]: I1006 13:49:52.879400 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k26ds" event={"ID":"3d28053a-e999-4f7d-887c-cb000b698544","Type":"ContainerStarted","Data":"9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209"} Oct 06 13:49:52 crc kubenswrapper[4892]: I1006 13:49:52.917511 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k26ds" podStartSLOduration=3.434018927 podStartE2EDuration="5.917495432s" podCreationTimestamp="2025-10-06 13:49:47 +0000 UTC" firstStartedPulling="2025-10-06 13:49:49.847213182 +0000 UTC m=+6076.396918957" lastFinishedPulling="2025-10-06 13:49:52.330689677 +0000 UTC m=+6078.880395462" observedRunningTime="2025-10-06 13:49:52.909082079 +0000 UTC m=+6079.458787854" watchObservedRunningTime="2025-10-06 13:49:52.917495432 +0000 UTC m=+6079.467201197" Oct 06 13:49:54 crc kubenswrapper[4892]: I1006 13:49:54.174430 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:49:54 crc kubenswrapper[4892]: E1006 13:49:54.175720 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:49:58 crc kubenswrapper[4892]: I1006 13:49:58.218068 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:58 
crc kubenswrapper[4892]: I1006 13:49:58.222669 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:58 crc kubenswrapper[4892]: I1006 13:49:58.270425 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:59 crc kubenswrapper[4892]: I1006 13:49:59.009583 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:49:59 crc kubenswrapper[4892]: I1006 13:49:59.066293 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k26ds"] Oct 06 13:50:00 crc kubenswrapper[4892]: I1006 13:50:00.971873 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k26ds" podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="registry-server" containerID="cri-o://9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209" gracePeriod=2 Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.467495 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.570776 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2llnz\" (UniqueName: \"kubernetes.io/projected/3d28053a-e999-4f7d-887c-cb000b698544-kube-api-access-2llnz\") pod \"3d28053a-e999-4f7d-887c-cb000b698544\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.571085 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-utilities\") pod \"3d28053a-e999-4f7d-887c-cb000b698544\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.571207 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-catalog-content\") pod \"3d28053a-e999-4f7d-887c-cb000b698544\" (UID: \"3d28053a-e999-4f7d-887c-cb000b698544\") " Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.572015 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-utilities" (OuterVolumeSpecName: "utilities") pod "3d28053a-e999-4f7d-887c-cb000b698544" (UID: "3d28053a-e999-4f7d-887c-cb000b698544"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.578471 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d28053a-e999-4f7d-887c-cb000b698544-kube-api-access-2llnz" (OuterVolumeSpecName: "kube-api-access-2llnz") pod "3d28053a-e999-4f7d-887c-cb000b698544" (UID: "3d28053a-e999-4f7d-887c-cb000b698544"). InnerVolumeSpecName "kube-api-access-2llnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.583172 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d28053a-e999-4f7d-887c-cb000b698544" (UID: "3d28053a-e999-4f7d-887c-cb000b698544"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.674281 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.674345 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d28053a-e999-4f7d-887c-cb000b698544-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.674360 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2llnz\" (UniqueName: \"kubernetes.io/projected/3d28053a-e999-4f7d-887c-cb000b698544-kube-api-access-2llnz\") on node \"crc\" DevicePath \"\"" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.985183 4892 generic.go:334] "Generic (PLEG): container finished" podID="3d28053a-e999-4f7d-887c-cb000b698544" containerID="9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209" exitCode=0 Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.985233 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k26ds" event={"ID":"3d28053a-e999-4f7d-887c-cb000b698544","Type":"ContainerDied","Data":"9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209"} Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.985267 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k26ds" event={"ID":"3d28053a-e999-4f7d-887c-cb000b698544","Type":"ContainerDied","Data":"13ec876d3e26926b09176cb77b10dcad741f22b1a535e5ecd1c0e198e652cad4"} Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.985290 4892 scope.go:117] "RemoveContainer" containerID="9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209" Oct 06 13:50:01 crc kubenswrapper[4892]: I1006 13:50:01.986145 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k26ds" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.008049 4892 scope.go:117] "RemoveContainer" containerID="3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.029798 4892 scope.go:117] "RemoveContainer" containerID="5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.086020 4892 scope.go:117] "RemoveContainer" containerID="9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209" Oct 06 13:50:02 crc kubenswrapper[4892]: E1006 13:50:02.087953 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209\": container with ID starting with 9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209 not found: ID does not exist" containerID="9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.087990 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209"} err="failed to get container status \"9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209\": rpc error: code = NotFound desc = could not find container \"9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209\": container with ID starting with 9fba7731e1fee0ba37b3c161c86c5202a11f01a49d4a7e4ee77cfbce395f5209 not found: ID does not exist" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.088013 4892 scope.go:117] "RemoveContainer" containerID="3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11" Oct 06 13:50:02 crc kubenswrapper[4892]: E1006 13:50:02.089807 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11\": container with ID starting with 3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11 not found: ID does not exist" containerID="3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.089836 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11"} err="failed to get container status \"3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11\": rpc error: code = NotFound desc = could not find container \"3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11\": container with ID starting with 3b455808c3d5f6ade783cd15a88a5f4f7695acb17d97788b0e606f89436edf11 not found: ID does not exist" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.089860 4892 scope.go:117] "RemoveContainer" containerID="5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399" Oct 06 13:50:02 crc kubenswrapper[4892]: E1006 13:50:02.090190 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399\": container with ID starting with 5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399 not found: ID does not exist" containerID="5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399" 
Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.090253 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399"} err="failed to get container status \"5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399\": rpc error: code = NotFound desc = could not find container \"5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399\": container with ID starting with 5e437a327b05c281af0a8bde72831752d8482ba94a1938710be9cb60921b8399 not found: ID does not exist" Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.092457 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k26ds"] Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.102788 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k26ds"] Oct 06 13:50:02 crc kubenswrapper[4892]: I1006 13:50:02.185102 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d28053a-e999-4f7d-887c-cb000b698544" path="/var/lib/kubelet/pods/3d28053a-e999-4f7d-887c-cb000b698544/volumes" Oct 06 13:50:07 crc kubenswrapper[4892]: I1006 13:50:07.169131 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:50:07 crc kubenswrapper[4892]: E1006 13:50:07.170193 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:50:20 crc kubenswrapper[4892]: I1006 13:50:20.169310 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:50:20 crc kubenswrapper[4892]: E1006 13:50:20.170485 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:50:35 crc kubenswrapper[4892]: I1006 13:50:35.168728 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:50:35 crc kubenswrapper[4892]: E1006 13:50:35.169759 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:50:46 crc kubenswrapper[4892]: I1006 13:50:46.170386 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:50:46 crc kubenswrapper[4892]: E1006 13:50:46.171721 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:50:57 crc kubenswrapper[4892]: I1006 13:50:57.169084 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:50:57 crc kubenswrapper[4892]: E1006 13:50:57.169805 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:51:12 crc kubenswrapper[4892]: I1006 13:51:12.170416 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:51:12 crc kubenswrapper[4892]: E1006 13:51:12.172903 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:51:23 crc kubenswrapper[4892]: I1006 13:51:23.168457 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:51:23 crc kubenswrapper[4892]: E1006 13:51:23.169122 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:51:36 crc kubenswrapper[4892]: I1006 13:51:36.169509 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:51:36 crc kubenswrapper[4892]: E1006 13:51:36.170438 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:51:47 crc kubenswrapper[4892]: I1006 13:51:47.169856 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:51:47 crc kubenswrapper[4892]: E1006 13:51:47.170615 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:52:02 crc kubenswrapper[4892]: I1006 13:52:02.169562 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:52:02 crc kubenswrapper[4892]: E1006 13:52:02.170453 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:52:16 crc kubenswrapper[4892]: I1006 13:52:16.168812 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:52:16 crc kubenswrapper[4892]: E1006 13:52:16.169813 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:52:30 crc kubenswrapper[4892]: I1006 13:52:30.169577 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:52:30 crc kubenswrapper[4892]: E1006 13:52:30.171089 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.467918 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dj7cv"] Oct 06 13:52:36 crc kubenswrapper[4892]: E1006 13:52:36.468941 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="extract-content" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.468958 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="extract-content" Oct 06 13:52:36 crc kubenswrapper[4892]: E1006 13:52:36.468966 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="registry-server" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.468973 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="registry-server" Oct 06 13:52:36 crc kubenswrapper[4892]: E1006 13:52:36.468995 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="extract-utilities" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.469001 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="extract-utilities" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.469196 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d28053a-e999-4f7d-887c-cb000b698544" containerName="registry-server" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.473697 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.486246 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj7cv"] Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.618369 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-utilities\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.618519 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrxc\" (UniqueName: \"kubernetes.io/projected/c96967de-a018-46cd-845a-1ccb2625fb73-kube-api-access-jnrxc\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.618575 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-catalog-content\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.720297 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-catalog-content\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.720776 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-utilities\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.720910 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-catalog-content\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.721074 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrxc\" (UniqueName: \"kubernetes.io/projected/c96967de-a018-46cd-845a-1ccb2625fb73-kube-api-access-jnrxc\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.721218 4892 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-utilities\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.743631 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrxc\" (UniqueName: \"kubernetes.io/projected/c96967de-a018-46cd-845a-1ccb2625fb73-kube-api-access-jnrxc\") pod \"certified-operators-dj7cv\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:36 crc kubenswrapper[4892]: I1006 13:52:36.793188 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:37 crc kubenswrapper[4892]: I1006 13:52:37.262956 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dj7cv"] Oct 06 13:52:37 crc kubenswrapper[4892]: I1006 13:52:37.813797 4892 generic.go:334] "Generic (PLEG): container finished" podID="c96967de-a018-46cd-845a-1ccb2625fb73" containerID="7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06" exitCode=0 Oct 06 13:52:37 crc kubenswrapper[4892]: I1006 13:52:37.814049 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7cv" event={"ID":"c96967de-a018-46cd-845a-1ccb2625fb73","Type":"ContainerDied","Data":"7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06"} Oct 06 13:52:37 crc kubenswrapper[4892]: I1006 13:52:37.814081 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7cv" event={"ID":"c96967de-a018-46cd-845a-1ccb2625fb73","Type":"ContainerStarted","Data":"a30b23ec6a00aba24430dbe65d7d66dcd2b8e1df452ad5b72c887655d761baf1"} Oct 06 13:52:38 crc kubenswrapper[4892]: I1006 13:52:38.831017 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7cv" event={"ID":"c96967de-a018-46cd-845a-1ccb2625fb73","Type":"ContainerStarted","Data":"ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a"} Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.647349 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q65jw"] Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.650068 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.674501 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q65jw"] Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.787208 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-utilities\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.787608 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpf8p\" (UniqueName: \"kubernetes.io/projected/42f0fdd2-0948-4979-ab03-3a035bb682f2-kube-api-access-wpf8p\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.787634 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-catalog-content\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.841303 4892 generic.go:334] "Generic (PLEG): container finished" podID="c96967de-a018-46cd-845a-1ccb2625fb73" containerID="ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a" exitCode=0 Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.841362 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7cv" event={"ID":"c96967de-a018-46cd-845a-1ccb2625fb73","Type":"ContainerDied","Data":"ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a"} Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.889141 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-utilities\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.889236 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-catalog-content\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.889262 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpf8p\" (UniqueName: \"kubernetes.io/projected/42f0fdd2-0948-4979-ab03-3a035bb682f2-kube-api-access-wpf8p\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.889743 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-utilities\") pod \"community-operators-q65jw\" (UID: 
\"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.889765 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-catalog-content\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.912808 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpf8p\" (UniqueName: \"kubernetes.io/projected/42f0fdd2-0948-4979-ab03-3a035bb682f2-kube-api-access-wpf8p\") pod \"community-operators-q65jw\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:39 crc kubenswrapper[4892]: I1006 13:52:39.980529 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:40 crc kubenswrapper[4892]: I1006 13:52:40.538500 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q65jw"] Oct 06 13:52:40 crc kubenswrapper[4892]: W1006 13:52:40.551952 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42f0fdd2_0948_4979_ab03_3a035bb682f2.slice/crio-572d6b35fe76fb812d6fe339f2696479214aa4dde8664d63cd67784647a47d5a WatchSource:0}: Error finding container 572d6b35fe76fb812d6fe339f2696479214aa4dde8664d63cd67784647a47d5a: Status 404 returned error can't find the container with id 572d6b35fe76fb812d6fe339f2696479214aa4dde8664d63cd67784647a47d5a Oct 06 13:52:40 crc kubenswrapper[4892]: I1006 13:52:40.853229 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jw" event={"ID":"42f0fdd2-0948-4979-ab03-3a035bb682f2","Type":"ContainerStarted","Data":"572d6b35fe76fb812d6fe339f2696479214aa4dde8664d63cd67784647a47d5a"} Oct 06 13:52:41 crc kubenswrapper[4892]: I1006 13:52:41.168609 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:52:41 crc kubenswrapper[4892]: E1006 13:52:41.168975 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:52:41 crc kubenswrapper[4892]: I1006 13:52:41.870662 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jw" event={"ID":"42f0fdd2-0948-4979-ab03-3a035bb682f2","Type":"ContainerDied","Data":"d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f"} Oct 06 13:52:41 crc kubenswrapper[4892]: I1006 13:52:41.870455 4892 generic.go:334] "Generic (PLEG): container finished" podID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerID="d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f" exitCode=0 Oct 06 13:52:41 crc kubenswrapper[4892]: I1006 13:52:41.876096 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7cv" 
event={"ID":"c96967de-a018-46cd-845a-1ccb2625fb73","Type":"ContainerStarted","Data":"04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a"} Oct 06 13:52:41 crc kubenswrapper[4892]: I1006 13:52:41.924083 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dj7cv" podStartSLOduration=2.866823837 podStartE2EDuration="5.924065879s" podCreationTimestamp="2025-10-06 13:52:36 +0000 UTC" firstStartedPulling="2025-10-06 13:52:37.81588719 +0000 UTC m=+6244.365592955" lastFinishedPulling="2025-10-06 13:52:40.873129192 +0000 UTC m=+6247.422834997" observedRunningTime="2025-10-06 13:52:41.91615346 +0000 UTC m=+6248.465859235" watchObservedRunningTime="2025-10-06 13:52:41.924065879 +0000 UTC m=+6248.473771644" Oct 06 13:52:42 crc kubenswrapper[4892]: I1006 13:52:42.894543 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jw" event={"ID":"42f0fdd2-0948-4979-ab03-3a035bb682f2","Type":"ContainerStarted","Data":"5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8"} Oct 06 13:52:43 crc kubenswrapper[4892]: I1006 13:52:43.913549 4892 generic.go:334] "Generic (PLEG): container finished" podID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerID="5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8" exitCode=0 Oct 06 13:52:43 crc kubenswrapper[4892]: I1006 13:52:43.913638 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jw" event={"ID":"42f0fdd2-0948-4979-ab03-3a035bb682f2","Type":"ContainerDied","Data":"5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8"} Oct 06 13:52:44 crc kubenswrapper[4892]: I1006 13:52:44.928703 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jw" event={"ID":"42f0fdd2-0948-4979-ab03-3a035bb682f2","Type":"ContainerStarted","Data":"17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593"} Oct 06 13:52:44 crc kubenswrapper[4892]: I1006 13:52:44.952823 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q65jw" podStartSLOduration=3.449369078 podStartE2EDuration="5.952799739s" podCreationTimestamp="2025-10-06 13:52:39 +0000 UTC" firstStartedPulling="2025-10-06 13:52:41.873385385 +0000 UTC m=+6248.423091180" lastFinishedPulling="2025-10-06 13:52:44.376816056 +0000 UTC m=+6250.926521841" observedRunningTime="2025-10-06 13:52:44.951164272 +0000 UTC m=+6251.500870057" watchObservedRunningTime="2025-10-06 13:52:44.952799739 +0000 UTC m=+6251.502505514" Oct 06 13:52:46 crc kubenswrapper[4892]: I1006 13:52:46.794169 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:46 crc kubenswrapper[4892]: I1006 13:52:46.794635 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:46 crc kubenswrapper[4892]: I1006 13:52:46.858838 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:47 crc kubenswrapper[4892]: I1006 13:52:47.024928 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:48 crc kubenswrapper[4892]: I1006 13:52:48.045218 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-dj7cv"] Oct 06 13:52:48 crc kubenswrapper[4892]: I1006 13:52:48.981371 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dj7cv" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="registry-server" containerID="cri-o://04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a" gracePeriod=2 Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.486547 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.596848 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-utilities\") pod \"c96967de-a018-46cd-845a-1ccb2625fb73\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.597057 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-catalog-content\") pod \"c96967de-a018-46cd-845a-1ccb2625fb73\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.597200 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnrxc\" (UniqueName: \"kubernetes.io/projected/c96967de-a018-46cd-845a-1ccb2625fb73-kube-api-access-jnrxc\") pod \"c96967de-a018-46cd-845a-1ccb2625fb73\" (UID: \"c96967de-a018-46cd-845a-1ccb2625fb73\") " Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.598102 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-utilities" (OuterVolumeSpecName: "utilities") pod "c96967de-a018-46cd-845a-1ccb2625fb73" (UID: "c96967de-a018-46cd-845a-1ccb2625fb73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.603245 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c96967de-a018-46cd-845a-1ccb2625fb73-kube-api-access-jnrxc" (OuterVolumeSpecName: "kube-api-access-jnrxc") pod "c96967de-a018-46cd-845a-1ccb2625fb73" (UID: "c96967de-a018-46cd-845a-1ccb2625fb73"). InnerVolumeSpecName "kube-api-access-jnrxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.651241 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c96967de-a018-46cd-845a-1ccb2625fb73" (UID: "c96967de-a018-46cd-845a-1ccb2625fb73"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.700469 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnrxc\" (UniqueName: \"kubernetes.io/projected/c96967de-a018-46cd-845a-1ccb2625fb73-kube-api-access-jnrxc\") on node \"crc\" DevicePath \"\"" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.700508 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.700523 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c96967de-a018-46cd-845a-1ccb2625fb73-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.980861 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.980923 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.993249 4892 generic.go:334] "Generic (PLEG): container finished" podID="c96967de-a018-46cd-845a-1ccb2625fb73" containerID="04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a" exitCode=0 Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.993293 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7cv" event={"ID":"c96967de-a018-46cd-845a-1ccb2625fb73","Type":"ContainerDied","Data":"04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a"} Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.993303 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dj7cv" Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.993340 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dj7cv" event={"ID":"c96967de-a018-46cd-845a-1ccb2625fb73","Type":"ContainerDied","Data":"a30b23ec6a00aba24430dbe65d7d66dcd2b8e1df452ad5b72c887655d761baf1"} Oct 06 13:52:49 crc kubenswrapper[4892]: I1006 13:52:49.993359 4892 scope.go:117] "RemoveContainer" containerID="04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.018209 4892 scope.go:117] "RemoveContainer" containerID="ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.028348 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dj7cv"] Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.038159 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dj7cv"] Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.051649 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.052952 4892 scope.go:117] "RemoveContainer" containerID="7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.112396 4892 scope.go:117] "RemoveContainer" containerID="04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a" Oct 06 13:52:50 crc kubenswrapper[4892]: E1006 13:52:50.112856 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a\": container with ID starting with 04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a not found: ID does not exist" containerID="04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.112894 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a"} err="failed to get container status \"04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a\": rpc error: code = NotFound desc = could not find container \"04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a\": container with ID starting with 04be3a80f8db3a62a4fd0672d5d09bc1e1fce4ddfa8f35c0b913b2c7973b992a not found: ID does not exist" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.112919 4892 scope.go:117] "RemoveContainer" containerID="ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a" Oct 06 13:52:50 crc kubenswrapper[4892]: E1006 13:52:50.113284 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a\": container with ID starting with ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a not found: ID does not exist" containerID="ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.113334 4892 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a"} err="failed to get container status \"ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a\": rpc error: code = NotFound desc = could not find container \"ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a\": container with ID starting with ddc0ddda7a91bf28949bf604d615a6738bb122a32e6e68fd2178801fd2eb8a0a not found: ID does not exist" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.113361 4892 scope.go:117] "RemoveContainer" containerID="7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06" Oct 06 13:52:50 crc kubenswrapper[4892]: E1006 13:52:50.113666 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06\": container with ID starting with 7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06 not found: ID does not exist" containerID="7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.113732 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06"} err="failed to get container status \"7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06\": rpc error: code = NotFound desc = could not find container \"7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06\": container with ID starting with 7b0d2bb1a76346f2e392260856d73d5a70b0f67dea461fe2c7536ee787105b06 not found: ID does not exist" Oct 06 13:52:50 crc kubenswrapper[4892]: I1006 13:52:50.181032 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" path="/var/lib/kubelet/pods/c96967de-a018-46cd-845a-1ccb2625fb73/volumes" Oct 06 13:52:51 crc kubenswrapper[4892]: I1006 13:52:51.084057 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:52 crc kubenswrapper[4892]: I1006 13:52:52.458939 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q65jw"] Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.038261 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q65jw" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerName="registry-server" containerID="cri-o://17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593" gracePeriod=2 Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.168577 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:52:53 crc kubenswrapper[4892]: E1006 13:52:53.168973 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.524701 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.687511 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-catalog-content\") pod \"42f0fdd2-0948-4979-ab03-3a035bb682f2\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.688639 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-utilities\") pod \"42f0fdd2-0948-4979-ab03-3a035bb682f2\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.688745 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpf8p\" (UniqueName: \"kubernetes.io/projected/42f0fdd2-0948-4979-ab03-3a035bb682f2-kube-api-access-wpf8p\") pod \"42f0fdd2-0948-4979-ab03-3a035bb682f2\" (UID: \"42f0fdd2-0948-4979-ab03-3a035bb682f2\") " Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.689909 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-utilities" (OuterVolumeSpecName: "utilities") pod "42f0fdd2-0948-4979-ab03-3a035bb682f2" (UID: "42f0fdd2-0948-4979-ab03-3a035bb682f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.695787 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42f0fdd2-0948-4979-ab03-3a035bb682f2-kube-api-access-wpf8p" (OuterVolumeSpecName: "kube-api-access-wpf8p") pod "42f0fdd2-0948-4979-ab03-3a035bb682f2" (UID: "42f0fdd2-0948-4979-ab03-3a035bb682f2"). InnerVolumeSpecName "kube-api-access-wpf8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.743033 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42f0fdd2-0948-4979-ab03-3a035bb682f2" (UID: "42f0fdd2-0948-4979-ab03-3a035bb682f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.791741 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.791792 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42f0fdd2-0948-4979-ab03-3a035bb682f2-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:52:53 crc kubenswrapper[4892]: I1006 13:52:53.791811 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpf8p\" (UniqueName: \"kubernetes.io/projected/42f0fdd2-0948-4979-ab03-3a035bb682f2-kube-api-access-wpf8p\") on node \"crc\" DevicePath \"\"" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.064102 4892 generic.go:334] "Generic (PLEG): container finished" podID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerID="17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593" exitCode=0 Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.064162 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jw" event={"ID":"42f0fdd2-0948-4979-ab03-3a035bb682f2","Type":"ContainerDied","Data":"17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593"} Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.064213 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q65jw" event={"ID":"42f0fdd2-0948-4979-ab03-3a035bb682f2","Type":"ContainerDied","Data":"572d6b35fe76fb812d6fe339f2696479214aa4dde8664d63cd67784647a47d5a"} Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.064253 4892 scope.go:117] "RemoveContainer" containerID="17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.064271 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q65jw" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.100707 4892 scope.go:117] "RemoveContainer" containerID="5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.103452 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q65jw"] Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.111334 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q65jw"] Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.134079 4892 scope.go:117] "RemoveContainer" containerID="d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.179146 4892 scope.go:117] "RemoveContainer" containerID="17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593" Oct 06 13:52:54 crc kubenswrapper[4892]: E1006 13:52:54.179654 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593\": container with ID starting with 17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593 not found: ID does not exist" containerID="17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.179703 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593"} err="failed to get container status \"17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593\": rpc error: code = NotFound desc = could not find container \"17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593\": container with ID starting with 17e718cd4fccb33b1b3e1363cf26db040bba41efe4b07aeca5cd6195270c1593 not found: ID does not exist" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.179733 4892 scope.go:117] "RemoveContainer" containerID="5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8" Oct 06 13:52:54 crc kubenswrapper[4892]: E1006 13:52:54.181940 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8\": container with ID starting with 5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8 not found: ID does not exist" containerID="5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.181989 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8"} err="failed to get container status \"5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8\": rpc error: code = NotFound desc = could not find container \"5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8\": container with ID starting with 5a4470cad57bdcb137ed7f1d7e29e574c33c2012a0236fecd177f9942e9992d8 not found: ID does not exist" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.182018 4892 scope.go:117] "RemoveContainer" containerID="d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f" Oct 06 13:52:54 crc kubenswrapper[4892]: E1006 13:52:54.182455 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f\": container with ID starting with d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f not found: ID does not exist" containerID="d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.182485 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f"} err="failed to get container status \"d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f\": rpc error: code = NotFound desc = could not find container \"d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f\": container with ID starting with d3ba2079d88b711016c0514bbe7c98763c5e98fcf51102742fb5bd1cc363c81f not found: ID does not exist" Oct 06 13:52:54 crc kubenswrapper[4892]: I1006 13:52:54.182652 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" path="/var/lib/kubelet/pods/42f0fdd2-0948-4979-ab03-3a035bb682f2/volumes" Oct 06 13:53:06 crc kubenswrapper[4892]: I1006 13:53:06.169197 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:53:06 crc kubenswrapper[4892]: E1006 13:53:06.171314 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:53:21 crc kubenswrapper[4892]: I1006 13:53:21.170007 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:53:21 crc kubenswrapper[4892]: E1006 13:53:21.171343 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:53:36 crc kubenswrapper[4892]: I1006 13:53:36.169174 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:53:36 crc kubenswrapper[4892]: E1006 13:53:36.170006 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:53:51 crc kubenswrapper[4892]: I1006 13:53:51.168550 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:53:51 crc kubenswrapper[4892]: E1006 13:53:51.169351 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 13:54:03 crc kubenswrapper[4892]: I1006 13:54:03.168300 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:54:03 crc kubenswrapper[4892]: I1006 13:54:03.829456 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"f6f829be6c18075b1c100101b78324c22f60a5bfcc75210c4e3f36acc90430cc"} Oct 06 13:55:59 crc kubenswrapper[4892]: I1006 13:55:59.153129 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-57bbd8d677-4mwpb" podUID="9b16ec0c-fdde-42a8-9a45-da67ecd56360" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 06 13:56:22 crc kubenswrapper[4892]: I1006 13:56:22.984782 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:56:22 crc kubenswrapper[4892]: I1006 13:56:22.985530 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:56:52 crc kubenswrapper[4892]: I1006 13:56:52.984544 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:56:52 crc kubenswrapper[4892]: I1006 13:56:52.985121 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:57:22 crc kubenswrapper[4892]: I1006 13:57:22.984822 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:57:22 crc kubenswrapper[4892]: I1006 13:57:22.985845 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 13:57:22 crc kubenswrapper[4892]: I1006 13:57:22.985930 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 13:57:22 crc kubenswrapper[4892]: I1006 13:57:22.987408 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6f829be6c18075b1c100101b78324c22f60a5bfcc75210c4e3f36acc90430cc"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 13:57:22 crc kubenswrapper[4892]: I1006 13:57:22.987554 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://f6f829be6c18075b1c100101b78324c22f60a5bfcc75210c4e3f36acc90430cc" gracePeriod=600 Oct 06 13:57:24 crc kubenswrapper[4892]: I1006 13:57:24.087848 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="f6f829be6c18075b1c100101b78324c22f60a5bfcc75210c4e3f36acc90430cc" exitCode=0 Oct 06 13:57:24 crc kubenswrapper[4892]: I1006 13:57:24.087923 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"f6f829be6c18075b1c100101b78324c22f60a5bfcc75210c4e3f36acc90430cc"} Oct 06 13:57:24 crc kubenswrapper[4892]: I1006 13:57:24.088448 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948"} Oct 06 13:57:24 crc kubenswrapper[4892]: I1006 13:57:24.088481 4892 scope.go:117] "RemoveContainer" containerID="fbc4fbb3aa5c8b228848ca77fc78fbdd986cd8957b0b6ae65e35f9569179c20e" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.161371 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wwjvl"] Oct 06 13:58:31 crc kubenswrapper[4892]: E1006 13:58:31.162918 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerName="extract-utilities" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.162950 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerName="extract-utilities" Oct 06 13:58:31 crc kubenswrapper[4892]: E1006 13:58:31.163001 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="extract-utilities" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.163018 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="extract-utilities" Oct 06 13:58:31 crc kubenswrapper[4892]: E1006 13:58:31.163056 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerName="extract-content" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.163074 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerName="extract-content" Oct 06 13:58:31 crc kubenswrapper[4892]: E1006 13:58:31.163127 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" 
containerName="registry-server" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.163142 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerName="registry-server" Oct 06 13:58:31 crc kubenswrapper[4892]: E1006 13:58:31.163188 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="registry-server" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.163204 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="registry-server" Oct 06 13:58:31 crc kubenswrapper[4892]: E1006 13:58:31.163238 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="extract-content" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.163253 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="extract-content" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.163732 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="c96967de-a018-46cd-845a-1ccb2625fb73" containerName="registry-server" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.163837 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="42f0fdd2-0948-4979-ab03-3a035bb682f2" containerName="registry-server" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.167381 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.197869 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwjvl"] Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.265809 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-utilities\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.265916 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5h4\" (UniqueName: \"kubernetes.io/projected/0a776c1e-295e-4cae-b78f-c5eaae257210-kube-api-access-4d5h4\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.265944 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-catalog-content\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.368426 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-utilities\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.368518 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4d5h4\" (UniqueName: \"kubernetes.io/projected/0a776c1e-295e-4cae-b78f-c5eaae257210-kube-api-access-4d5h4\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.368544 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-catalog-content\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.369016 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-utilities\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.369042 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-catalog-content\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.397111 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d5h4\" (UniqueName: \"kubernetes.io/projected/0a776c1e-295e-4cae-b78f-c5eaae257210-kube-api-access-4d5h4\") pod \"redhat-operators-wwjvl\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.493366 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:31 crc kubenswrapper[4892]: I1006 13:58:31.984314 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wwjvl"] Oct 06 13:58:32 crc kubenswrapper[4892]: I1006 13:58:32.779512 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerID="72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6" exitCode=0 Oct 06 13:58:32 crc kubenswrapper[4892]: I1006 13:58:32.779568 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjvl" event={"ID":"0a776c1e-295e-4cae-b78f-c5eaae257210","Type":"ContainerDied","Data":"72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6"} Oct 06 13:58:32 crc kubenswrapper[4892]: I1006 13:58:32.779635 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjvl" event={"ID":"0a776c1e-295e-4cae-b78f-c5eaae257210","Type":"ContainerStarted","Data":"4f7043d01b99d54af4066a7a0483ae9b4082426893306c58a08a42b8ea0ddc39"} Oct 06 13:58:32 crc kubenswrapper[4892]: I1006 13:58:32.782654 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 13:58:34 crc kubenswrapper[4892]: I1006 13:58:34.798367 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjvl" event={"ID":"0a776c1e-295e-4cae-b78f-c5eaae257210","Type":"ContainerStarted","Data":"b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103"} Oct 06 13:58:35 crc kubenswrapper[4892]: I1006 13:58:35.809146 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerID="b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103" exitCode=0 Oct 06 13:58:35 crc kubenswrapper[4892]: I1006 13:58:35.809192 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjvl" event={"ID":"0a776c1e-295e-4cae-b78f-c5eaae257210","Type":"ContainerDied","Data":"b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103"} Oct 06 13:58:36 crc kubenswrapper[4892]: I1006 13:58:36.826793 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjvl" event={"ID":"0a776c1e-295e-4cae-b78f-c5eaae257210","Type":"ContainerStarted","Data":"42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe"} Oct 06 13:58:36 crc kubenswrapper[4892]: I1006 13:58:36.853762 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wwjvl" podStartSLOduration=2.352250421 podStartE2EDuration="5.853733451s" podCreationTimestamp="2025-10-06 13:58:31 +0000 UTC" firstStartedPulling="2025-10-06 13:58:32.782283982 +0000 UTC m=+6599.331989747" lastFinishedPulling="2025-10-06 13:58:36.283767012 +0000 UTC m=+6602.833472777" observedRunningTime="2025-10-06 13:58:36.842880318 +0000 UTC m=+6603.392586133" watchObservedRunningTime="2025-10-06 13:58:36.853733451 +0000 UTC m=+6603.403439246" Oct 06 13:58:41 crc kubenswrapper[4892]: I1006 13:58:41.493749 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:41 crc kubenswrapper[4892]: I1006 13:58:41.494280 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:41 crc 
kubenswrapper[4892]: I1006 13:58:41.571559 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:41 crc kubenswrapper[4892]: I1006 13:58:41.961709 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:42 crc kubenswrapper[4892]: I1006 13:58:42.045277 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwjvl"] Oct 06 13:58:43 crc kubenswrapper[4892]: I1006 13:58:43.899541 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wwjvl" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="registry-server" containerID="cri-o://42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe" gracePeriod=2 Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.387295 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.447080 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d5h4\" (UniqueName: \"kubernetes.io/projected/0a776c1e-295e-4cae-b78f-c5eaae257210-kube-api-access-4d5h4\") pod \"0a776c1e-295e-4cae-b78f-c5eaae257210\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.447187 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-utilities\") pod \"0a776c1e-295e-4cae-b78f-c5eaae257210\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.447373 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-catalog-content\") pod \"0a776c1e-295e-4cae-b78f-c5eaae257210\" (UID: \"0a776c1e-295e-4cae-b78f-c5eaae257210\") " Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.448478 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-utilities" (OuterVolumeSpecName: "utilities") pod "0a776c1e-295e-4cae-b78f-c5eaae257210" (UID: "0a776c1e-295e-4cae-b78f-c5eaae257210"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.453021 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a776c1e-295e-4cae-b78f-c5eaae257210-kube-api-access-4d5h4" (OuterVolumeSpecName: "kube-api-access-4d5h4") pod "0a776c1e-295e-4cae-b78f-c5eaae257210" (UID: "0a776c1e-295e-4cae-b78f-c5eaae257210"). InnerVolumeSpecName "kube-api-access-4d5h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.526774 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a776c1e-295e-4cae-b78f-c5eaae257210" (UID: "0a776c1e-295e-4cae-b78f-c5eaae257210"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.550849 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d5h4\" (UniqueName: \"kubernetes.io/projected/0a776c1e-295e-4cae-b78f-c5eaae257210-kube-api-access-4d5h4\") on node \"crc\" DevicePath \"\"" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.550928 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.550943 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a776c1e-295e-4cae-b78f-c5eaae257210-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.918802 4892 generic.go:334] "Generic (PLEG): container finished" podID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerID="42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe" exitCode=0 Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.918874 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjvl" event={"ID":"0a776c1e-295e-4cae-b78f-c5eaae257210","Type":"ContainerDied","Data":"42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe"} Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.918918 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wwjvl" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.918955 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wwjvl" event={"ID":"0a776c1e-295e-4cae-b78f-c5eaae257210","Type":"ContainerDied","Data":"4f7043d01b99d54af4066a7a0483ae9b4082426893306c58a08a42b8ea0ddc39"} Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.918989 4892 scope.go:117] "RemoveContainer" containerID="42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.978656 4892 scope.go:117] "RemoveContainer" containerID="b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103" Oct 06 13:58:44 crc kubenswrapper[4892]: I1006 13:58:44.998156 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wwjvl"] Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.032549 4892 scope.go:117] "RemoveContainer" containerID="72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6" Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.050307 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wwjvl"] Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.091196 4892 scope.go:117] "RemoveContainer" containerID="42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe" Oct 06 13:58:45 crc kubenswrapper[4892]: E1006 13:58:45.091684 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe\": container with ID starting with 42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe not found: ID does not exist" containerID="42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe" Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.091735 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe"} err="failed to get container status \"42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe\": rpc error: code = NotFound desc = could not find container \"42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe\": container with ID starting with 42ce538ae56d470a1457d7ae392cdee0b03e45367f3ec89e4b7a60a9685fd3fe not found: ID does not exist" Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.091762 4892 scope.go:117] "RemoveContainer" containerID="b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103" Oct 06 13:58:45 crc kubenswrapper[4892]: E1006 13:58:45.092116 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103\": container with ID starting with b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103 not found: ID does not exist" containerID="b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103" Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.092183 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103"} err="failed to get container status \"b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103\": rpc error: code = NotFound desc = could not find container \"b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103\": container with ID starting with b8e8f8df5587da53df09ad6f786420f13c2e44e04da86ffe69abeeb67e639103 not found: ID does not exist" Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.092228 4892 scope.go:117] "RemoveContainer" containerID="72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6" Oct 06 13:58:45 crc kubenswrapper[4892]: E1006 13:58:45.092719 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6\": container with ID starting with 72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6 not found: ID does not exist" containerID="72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6" Oct 06 13:58:45 crc kubenswrapper[4892]: I1006 13:58:45.092770 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6"} err="failed to get container status \"72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6\": rpc error: code = NotFound desc = could not find container \"72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6\": container with ID starting with 72573a59d67b380824dcf24a7201b9d3f6f81e4ada683b4d493bed0909e321f6 not found: ID does not exist" Oct 06 13:58:46 crc kubenswrapper[4892]: I1006 13:58:46.186764 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" path="/var/lib/kubelet/pods/0a776c1e-295e-4cae-b78f-c5eaae257210/volumes" Oct 06 13:59:52 crc kubenswrapper[4892]: I1006 13:59:52.984375 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 13:59:52 crc kubenswrapper[4892]: I1006 13:59:52.985382 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.181912 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb"] Oct 06 14:00:00 crc kubenswrapper[4892]: E1006 14:00:00.182769 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="registry-server" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.182784 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="registry-server" Oct 06 14:00:00 crc kubenswrapper[4892]: E1006 14:00:00.182819 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="extract-content" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.182824 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="extract-content" Oct 06 14:00:00 crc kubenswrapper[4892]: E1006 14:00:00.182842 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="extract-utilities" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.182847 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="extract-utilities" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.183036 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a776c1e-295e-4cae-b78f-c5eaae257210" containerName="registry-server" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.183725 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.186804 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.187615 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.191403 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb"] Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.332719 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f2a163a-f8f9-433b-a399-18d4e5e726b6-config-volume\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.332817 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f2a163a-f8f9-433b-a399-18d4e5e726b6-secret-volume\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.332840 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxmbm\" (UniqueName: \"kubernetes.io/projected/1f2a163a-f8f9-433b-a399-18d4e5e726b6-kube-api-access-nxmbm\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.435468 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f2a163a-f8f9-433b-a399-18d4e5e726b6-secret-volume\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.435520 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxmbm\" (UniqueName: \"kubernetes.io/projected/1f2a163a-f8f9-433b-a399-18d4e5e726b6-kube-api-access-nxmbm\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.435657 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f2a163a-f8f9-433b-a399-18d4e5e726b6-config-volume\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.436781 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f2a163a-f8f9-433b-a399-18d4e5e726b6-config-volume\") pod 
\"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.442497 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f2a163a-f8f9-433b-a399-18d4e5e726b6-secret-volume\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.453031 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxmbm\" (UniqueName: \"kubernetes.io/projected/1f2a163a-f8f9-433b-a399-18d4e5e726b6-kube-api-access-nxmbm\") pod \"collect-profiles-29329320-rffdb\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.514439 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:00 crc kubenswrapper[4892]: I1006 14:00:00.994744 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb"] Oct 06 14:00:01 crc kubenswrapper[4892]: I1006 14:00:01.744698 4892 generic.go:334] "Generic (PLEG): container finished" podID="1f2a163a-f8f9-433b-a399-18d4e5e726b6" containerID="6319d06543e687c5039ffc8e371e01fcc69f89b1e8dee74369932d41c9899fd4" exitCode=0 Oct 06 14:00:01 crc kubenswrapper[4892]: I1006 14:00:01.744766 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" event={"ID":"1f2a163a-f8f9-433b-a399-18d4e5e726b6","Type":"ContainerDied","Data":"6319d06543e687c5039ffc8e371e01fcc69f89b1e8dee74369932d41c9899fd4"} Oct 06 14:00:01 crc kubenswrapper[4892]: I1006 14:00:01.745020 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" event={"ID":"1f2a163a-f8f9-433b-a399-18d4e5e726b6","Type":"ContainerStarted","Data":"3aaf5812dc60710140145826dc98b2c6803a3e988f4a0707e8ab3cab6a5db1fb"} Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.093382 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.202665 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxmbm\" (UniqueName: \"kubernetes.io/projected/1f2a163a-f8f9-433b-a399-18d4e5e726b6-kube-api-access-nxmbm\") pod \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.202791 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f2a163a-f8f9-433b-a399-18d4e5e726b6-secret-volume\") pod \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.202848 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f2a163a-f8f9-433b-a399-18d4e5e726b6-config-volume\") pod \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\" (UID: \"1f2a163a-f8f9-433b-a399-18d4e5e726b6\") " Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.203772 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f2a163a-f8f9-433b-a399-18d4e5e726b6-config-volume" (OuterVolumeSpecName: "config-volume") pod "1f2a163a-f8f9-433b-a399-18d4e5e726b6" (UID: "1f2a163a-f8f9-433b-a399-18d4e5e726b6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.204377 4892 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f2a163a-f8f9-433b-a399-18d4e5e726b6-config-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.208504 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f2a163a-f8f9-433b-a399-18d4e5e726b6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1f2a163a-f8f9-433b-a399-18d4e5e726b6" (UID: "1f2a163a-f8f9-433b-a399-18d4e5e726b6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.209043 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2a163a-f8f9-433b-a399-18d4e5e726b6-kube-api-access-nxmbm" (OuterVolumeSpecName: "kube-api-access-nxmbm") pod "1f2a163a-f8f9-433b-a399-18d4e5e726b6" (UID: "1f2a163a-f8f9-433b-a399-18d4e5e726b6"). InnerVolumeSpecName "kube-api-access-nxmbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.305998 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxmbm\" (UniqueName: \"kubernetes.io/projected/1f2a163a-f8f9-433b-a399-18d4e5e726b6-kube-api-access-nxmbm\") on node \"crc\" DevicePath \"\"" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.306981 4892 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f2a163a-f8f9-433b-a399-18d4e5e726b6-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.766725 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" event={"ID":"1f2a163a-f8f9-433b-a399-18d4e5e726b6","Type":"ContainerDied","Data":"3aaf5812dc60710140145826dc98b2c6803a3e988f4a0707e8ab3cab6a5db1fb"} Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.767269 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3aaf5812dc60710140145826dc98b2c6803a3e988f4a0707e8ab3cab6a5db1fb" Oct 06 14:00:03 crc kubenswrapper[4892]: I1006 14:00:03.766801 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29329320-rffdb" Oct 06 14:00:04 crc kubenswrapper[4892]: I1006 14:00:04.163038 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk"] Oct 06 14:00:04 crc kubenswrapper[4892]: I1006 14:00:04.180847 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29329275-mrxlk"] Oct 06 14:00:06 crc kubenswrapper[4892]: I1006 14:00:06.181589 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56eb2a3f-a505-4016-9e9d-9fadd306f540" path="/var/lib/kubelet/pods/56eb2a3f-a505-4016-9e9d-9fadd306f540/volumes" Oct 06 14:00:22 crc kubenswrapper[4892]: I1006 14:00:22.984237 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:00:22 crc kubenswrapper[4892]: I1006 14:00:22.984899 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:00:52 crc kubenswrapper[4892]: I1006 14:00:52.984113 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:00:52 crc kubenswrapper[4892]: I1006 14:00:52.984706 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:00:52 crc kubenswrapper[4892]: 
I1006 14:00:52.984753 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 14:00:52 crc kubenswrapper[4892]: I1006 14:00:52.985585 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:00:52 crc kubenswrapper[4892]: I1006 14:00:52.985643 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" gracePeriod=600 Oct 06 14:00:53 crc kubenswrapper[4892]: E1006 14:00:53.129564 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:00:53 crc kubenswrapper[4892]: I1006 14:00:53.310823 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" exitCode=0 Oct 06 14:00:53 crc kubenswrapper[4892]: I1006 14:00:53.310873 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948"} Oct 06 14:00:53 crc kubenswrapper[4892]: I1006 14:00:53.310941 4892 scope.go:117] "RemoveContainer" containerID="f6f829be6c18075b1c100101b78324c22f60a5bfcc75210c4e3f36acc90430cc" Oct 06 14:00:53 crc kubenswrapper[4892]: I1006 14:00:53.311752 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:00:53 crc kubenswrapper[4892]: E1006 14:00:53.312251 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.150967 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29329321-qw7ls"] Oct 06 14:01:00 crc kubenswrapper[4892]: E1006 14:01:00.152899 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2a163a-f8f9-433b-a399-18d4e5e726b6" containerName="collect-profiles" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.152985 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2a163a-f8f9-433b-a399-18d4e5e726b6" containerName="collect-profiles" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.153396 
4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2a163a-f8f9-433b-a399-18d4e5e726b6" containerName="collect-profiles" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.154207 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.168186 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329321-qw7ls"] Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.241851 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-combined-ca-bundle\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.242021 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-fernet-keys\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.242218 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-config-data\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.242270 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpntv\" (UniqueName: \"kubernetes.io/projected/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-kube-api-access-fpntv\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.344676 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-combined-ca-bundle\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.344771 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-fernet-keys\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.344846 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-config-data\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.344879 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpntv\" (UniqueName: \"kubernetes.io/projected/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-kube-api-access-fpntv\") pod 
\"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.352295 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-combined-ca-bundle\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.352609 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-fernet-keys\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.354788 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-config-data\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.364945 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpntv\" (UniqueName: \"kubernetes.io/projected/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-kube-api-access-fpntv\") pod \"keystone-cron-29329321-qw7ls\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:00 crc kubenswrapper[4892]: I1006 14:01:00.519373 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:01 crc kubenswrapper[4892]: I1006 14:01:01.016210 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29329321-qw7ls"] Oct 06 14:01:01 crc kubenswrapper[4892]: I1006 14:01:01.400999 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329321-qw7ls" event={"ID":"4e85ae2a-90b8-44f1-9e60-38580ae44fb0","Type":"ContainerStarted","Data":"ade879489a1706627510442c6f02266b205bba069f2230bffbe22971de8d95b3"} Oct 06 14:01:01 crc kubenswrapper[4892]: I1006 14:01:01.401191 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329321-qw7ls" event={"ID":"4e85ae2a-90b8-44f1-9e60-38580ae44fb0","Type":"ContainerStarted","Data":"2f8d1711e76fcc5dd749905d460ce47bcc292034ab9be4a140368c971e01989e"} Oct 06 14:01:01 crc kubenswrapper[4892]: I1006 14:01:01.416845 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29329321-qw7ls" podStartSLOduration=1.416823293 podStartE2EDuration="1.416823293s" podCreationTimestamp="2025-10-06 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:01:01.413908599 +0000 UTC m=+6747.963614364" watchObservedRunningTime="2025-10-06 14:01:01.416823293 +0000 UTC m=+6747.966529058" Oct 06 14:01:04 crc kubenswrapper[4892]: I1006 14:01:04.182022 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:01:04 crc kubenswrapper[4892]: E1006 14:01:04.182621 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:01:04 crc kubenswrapper[4892]: I1006 14:01:04.803945 4892 scope.go:117] "RemoveContainer" containerID="05510eab11c4a7352b0a8bce1de5e685634e8042913fc492eda8f8c3148d0e53" Oct 06 14:01:05 crc kubenswrapper[4892]: I1006 14:01:05.446458 4892 generic.go:334] "Generic (PLEG): container finished" podID="4e85ae2a-90b8-44f1-9e60-38580ae44fb0" containerID="ade879489a1706627510442c6f02266b205bba069f2230bffbe22971de8d95b3" exitCode=0 Oct 06 14:01:05 crc kubenswrapper[4892]: I1006 14:01:05.446539 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329321-qw7ls" event={"ID":"4e85ae2a-90b8-44f1-9e60-38580ae44fb0","Type":"ContainerDied","Data":"ade879489a1706627510442c6f02266b205bba069f2230bffbe22971de8d95b3"} Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.819661 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.967928 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-config-data\") pod \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.968273 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-combined-ca-bundle\") pod \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.968337 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpntv\" (UniqueName: \"kubernetes.io/projected/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-kube-api-access-fpntv\") pod \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.968423 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-fernet-keys\") pod \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\" (UID: \"4e85ae2a-90b8-44f1-9e60-38580ae44fb0\") " Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.973600 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4e85ae2a-90b8-44f1-9e60-38580ae44fb0" (UID: "4e85ae2a-90b8-44f1-9e60-38580ae44fb0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.976792 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-kube-api-access-fpntv" (OuterVolumeSpecName: "kube-api-access-fpntv") pod "4e85ae2a-90b8-44f1-9e60-38580ae44fb0" (UID: "4e85ae2a-90b8-44f1-9e60-38580ae44fb0"). InnerVolumeSpecName "kube-api-access-fpntv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:01:06 crc kubenswrapper[4892]: I1006 14:01:06.995295 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e85ae2a-90b8-44f1-9e60-38580ae44fb0" (UID: "4e85ae2a-90b8-44f1-9e60-38580ae44fb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.034592 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-config-data" (OuterVolumeSpecName: "config-data") pod "4e85ae2a-90b8-44f1-9e60-38580ae44fb0" (UID: "4e85ae2a-90b8-44f1-9e60-38580ae44fb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.071396 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpntv\" (UniqueName: \"kubernetes.io/projected/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-kube-api-access-fpntv\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.071430 4892 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.071441 4892 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.071448 4892 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e85ae2a-90b8-44f1-9e60-38580ae44fb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.467280 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29329321-qw7ls" event={"ID":"4e85ae2a-90b8-44f1-9e60-38580ae44fb0","Type":"ContainerDied","Data":"2f8d1711e76fcc5dd749905d460ce47bcc292034ab9be4a140368c971e01989e"} Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.467359 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8d1711e76fcc5dd749905d460ce47bcc292034ab9be4a140368c971e01989e" Oct 06 14:01:07 crc kubenswrapper[4892]: I1006 14:01:07.467370 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29329321-qw7ls" Oct 06 14:01:19 crc kubenswrapper[4892]: I1006 14:01:19.169101 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:01:19 crc kubenswrapper[4892]: E1006 14:01:19.170293 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:01:32 crc kubenswrapper[4892]: I1006 14:01:32.171000 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:01:32 crc kubenswrapper[4892]: E1006 14:01:32.172565 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:01:43 crc kubenswrapper[4892]: I1006 14:01:43.168806 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:01:43 crc kubenswrapper[4892]: E1006 14:01:43.169707 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:01:56 crc kubenswrapper[4892]: I1006 14:01:56.168918 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:01:56 crc kubenswrapper[4892]: E1006 14:01:56.170110 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:02:10 crc kubenswrapper[4892]: I1006 14:02:10.169389 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:02:10 crc kubenswrapper[4892]: E1006 14:02:10.170455 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.570463 4892 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-ktz9n/must-gather-rthjm"] Oct 06 14:02:21 crc kubenswrapper[4892]: E1006 14:02:21.571299 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e85ae2a-90b8-44f1-9e60-38580ae44fb0" containerName="keystone-cron" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.571311 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e85ae2a-90b8-44f1-9e60-38580ae44fb0" containerName="keystone-cron" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.571510 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e85ae2a-90b8-44f1-9e60-38580ae44fb0" containerName="keystone-cron" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.575442 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.587727 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ktz9n"/"openshift-service-ca.crt" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.589057 4892 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ktz9n"/"kube-root-ca.crt" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.589136 4892 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ktz9n"/"default-dockercfg-jvszw" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.593712 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ktz9n/must-gather-rthjm"] Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.646468 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k444g\" (UniqueName: \"kubernetes.io/projected/0e5e4441-6007-4f81-8b18-679b18dd08f0-kube-api-access-k444g\") pod \"must-gather-rthjm\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.646543 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e5e4441-6007-4f81-8b18-679b18dd08f0-must-gather-output\") pod \"must-gather-rthjm\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.748183 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k444g\" (UniqueName: \"kubernetes.io/projected/0e5e4441-6007-4f81-8b18-679b18dd08f0-kube-api-access-k444g\") pod \"must-gather-rthjm\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.748252 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e5e4441-6007-4f81-8b18-679b18dd08f0-must-gather-output\") pod \"must-gather-rthjm\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.748802 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e5e4441-6007-4f81-8b18-679b18dd08f0-must-gather-output\") pod \"must-gather-rthjm\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " 
pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.787082 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k444g\" (UniqueName: \"kubernetes.io/projected/0e5e4441-6007-4f81-8b18-679b18dd08f0-kube-api-access-k444g\") pod \"must-gather-rthjm\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:21 crc kubenswrapper[4892]: I1006 14:02:21.904520 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:02:22 crc kubenswrapper[4892]: I1006 14:02:22.170478 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:02:22 crc kubenswrapper[4892]: E1006 14:02:22.171032 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:02:22 crc kubenswrapper[4892]: I1006 14:02:22.369165 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ktz9n/must-gather-rthjm"] Oct 06 14:02:22 crc kubenswrapper[4892]: W1006 14:02:22.374890 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e5e4441_6007_4f81_8b18_679b18dd08f0.slice/crio-8343447d1c640c50e3b3a9460c70ed7d6ed594bca5edc2f62729f4fe71ac0cf0 WatchSource:0}: Error finding container 8343447d1c640c50e3b3a9460c70ed7d6ed594bca5edc2f62729f4fe71ac0cf0: Status 404 returned error can't find the container with id 8343447d1c640c50e3b3a9460c70ed7d6ed594bca5edc2f62729f4fe71ac0cf0 Oct 06 14:02:23 crc kubenswrapper[4892]: I1006 14:02:23.396250 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/must-gather-rthjm" event={"ID":"0e5e4441-6007-4f81-8b18-679b18dd08f0","Type":"ContainerStarted","Data":"8343447d1c640c50e3b3a9460c70ed7d6ed594bca5edc2f62729f4fe71ac0cf0"} Oct 06 14:02:30 crc kubenswrapper[4892]: I1006 14:02:30.500375 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/must-gather-rthjm" event={"ID":"0e5e4441-6007-4f81-8b18-679b18dd08f0","Type":"ContainerStarted","Data":"1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055"} Oct 06 14:02:31 crc kubenswrapper[4892]: I1006 14:02:31.510707 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/must-gather-rthjm" event={"ID":"0e5e4441-6007-4f81-8b18-679b18dd08f0","Type":"ContainerStarted","Data":"81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635"} Oct 06 14:02:31 crc kubenswrapper[4892]: I1006 14:02:31.528032 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ktz9n/must-gather-rthjm" podStartSLOduration=2.812137188 podStartE2EDuration="10.528016376s" podCreationTimestamp="2025-10-06 14:02:21 +0000 UTC" firstStartedPulling="2025-10-06 14:02:22.380399634 +0000 UTC m=+6828.930105399" lastFinishedPulling="2025-10-06 14:02:30.096278822 +0000 UTC m=+6836.645984587" observedRunningTime="2025-10-06 14:02:31.524507085 +0000 UTC m=+6838.074212850" 
watchObservedRunningTime="2025-10-06 14:02:31.528016376 +0000 UTC m=+6838.077722141" Oct 06 14:02:33 crc kubenswrapper[4892]: E1006 14:02:33.832044 4892 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.144:52412->38.102.83.144:40237: write tcp 38.102.83.144:52412->38.102.83.144:40237: write: broken pipe Oct 06 14:02:34 crc kubenswrapper[4892]: E1006 14:02:34.065808 4892 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.144:52478->38.102.83.144:40237: write tcp 38.102.83.144:52478->38.102.83.144:40237: write: broken pipe Oct 06 14:02:34 crc kubenswrapper[4892]: I1006 14:02:34.777338 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-7knrj"] Oct 06 14:02:34 crc kubenswrapper[4892]: I1006 14:02:34.778850 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:34 crc kubenswrapper[4892]: I1006 14:02:34.977918 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djfgx\" (UniqueName: \"kubernetes.io/projected/24f5854d-3bfb-4184-8aeb-e413a3812889-kube-api-access-djfgx\") pod \"crc-debug-7knrj\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:34 crc kubenswrapper[4892]: I1006 14:02:34.979208 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24f5854d-3bfb-4184-8aeb-e413a3812889-host\") pod \"crc-debug-7knrj\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:35 crc kubenswrapper[4892]: I1006 14:02:35.080595 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24f5854d-3bfb-4184-8aeb-e413a3812889-host\") pod \"crc-debug-7knrj\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:35 crc kubenswrapper[4892]: I1006 14:02:35.080683 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djfgx\" (UniqueName: \"kubernetes.io/projected/24f5854d-3bfb-4184-8aeb-e413a3812889-kube-api-access-djfgx\") pod \"crc-debug-7knrj\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:35 crc kubenswrapper[4892]: I1006 14:02:35.080739 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24f5854d-3bfb-4184-8aeb-e413a3812889-host\") pod \"crc-debug-7knrj\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:35 crc kubenswrapper[4892]: I1006 14:02:35.106108 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djfgx\" (UniqueName: \"kubernetes.io/projected/24f5854d-3bfb-4184-8aeb-e413a3812889-kube-api-access-djfgx\") pod \"crc-debug-7knrj\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:35 crc kubenswrapper[4892]: I1006 14:02:35.168598 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:02:35 crc kubenswrapper[4892]: E1006 14:02:35.168880 4892 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:02:35 crc kubenswrapper[4892]: I1006 14:02:35.399258 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:02:35 crc kubenswrapper[4892]: I1006 14:02:35.549073 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" event={"ID":"24f5854d-3bfb-4184-8aeb-e413a3812889","Type":"ContainerStarted","Data":"248b0440d9ccaba0420d4fb8b8013be17662a1db1b307e605cb83ad2d974cb25"} Oct 06 14:02:45 crc kubenswrapper[4892]: I1006 14:02:45.650950 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" event={"ID":"24f5854d-3bfb-4184-8aeb-e413a3812889","Type":"ContainerStarted","Data":"b686b1df9f7a3104fb59640b473aab6d49948982d1bed01e1babc271e1552a5c"} Oct 06 14:02:45 crc kubenswrapper[4892]: I1006 14:02:45.665017 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" podStartSLOduration=1.923685798 podStartE2EDuration="11.664999013s" podCreationTimestamp="2025-10-06 14:02:34 +0000 UTC" firstStartedPulling="2025-10-06 14:02:35.438719264 +0000 UTC m=+6841.988425029" lastFinishedPulling="2025-10-06 14:02:45.180032479 +0000 UTC m=+6851.729738244" observedRunningTime="2025-10-06 14:02:45.66246808 +0000 UTC m=+6852.212173865" watchObservedRunningTime="2025-10-06 14:02:45.664999013 +0000 UTC m=+6852.214704778" Oct 06 14:02:48 crc kubenswrapper[4892]: I1006 14:02:48.169080 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:02:48 crc kubenswrapper[4892]: E1006 14:02:48.170755 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:03:01 crc kubenswrapper[4892]: I1006 14:03:01.168858 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:03:01 crc kubenswrapper[4892]: E1006 14:03:01.169563 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:03:14 crc kubenswrapper[4892]: I1006 14:03:14.175370 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:03:14 crc kubenswrapper[4892]: E1006 14:03:14.175987 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:03:28 crc kubenswrapper[4892]: I1006 14:03:28.172983 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:03:28 crc kubenswrapper[4892]: E1006 14:03:28.173673 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.664604 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8cgh4"] Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.668022 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.680872 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cgh4"] Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.813840 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-utilities\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.813910 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkchd\" (UniqueName: \"kubernetes.io/projected/e9922289-1f8b-49be-a360-7cdd62a8039a-kube-api-access-zkchd\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.814008 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-catalog-content\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.915263 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-utilities\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.915316 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkchd\" (UniqueName: \"kubernetes.io/projected/e9922289-1f8b-49be-a360-7cdd62a8039a-kube-api-access-zkchd\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" 
Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.915409 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-catalog-content\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.915922 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-catalog-content\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.915926 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-utilities\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:29 crc kubenswrapper[4892]: I1006 14:03:29.939607 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkchd\" (UniqueName: \"kubernetes.io/projected/e9922289-1f8b-49be-a360-7cdd62a8039a-kube-api-access-zkchd\") pod \"community-operators-8cgh4\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:30 crc kubenswrapper[4892]: I1006 14:03:30.036678 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:30 crc kubenswrapper[4892]: I1006 14:03:30.632317 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cgh4"] Oct 06 14:03:31 crc kubenswrapper[4892]: I1006 14:03:31.083101 4892 generic.go:334] "Generic (PLEG): container finished" podID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerID="61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020" exitCode=0 Oct 06 14:03:31 crc kubenswrapper[4892]: I1006 14:03:31.083301 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cgh4" event={"ID":"e9922289-1f8b-49be-a360-7cdd62a8039a","Type":"ContainerDied","Data":"61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020"} Oct 06 14:03:31 crc kubenswrapper[4892]: I1006 14:03:31.083364 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cgh4" event={"ID":"e9922289-1f8b-49be-a360-7cdd62a8039a","Type":"ContainerStarted","Data":"d3e8aba7f47266eb7e8e0965bcc765a40c29f77ceb5af6940a2b7eb31e017c18"} Oct 06 14:03:33 crc kubenswrapper[4892]: I1006 14:03:33.110157 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cgh4" event={"ID":"e9922289-1f8b-49be-a360-7cdd62a8039a","Type":"ContainerStarted","Data":"4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b"} Oct 06 14:03:36 crc kubenswrapper[4892]: I1006 14:03:36.139015 4892 generic.go:334] "Generic (PLEG): container finished" podID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerID="4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b" exitCode=0 Oct 06 14:03:36 crc kubenswrapper[4892]: I1006 14:03:36.139094 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-8cgh4" event={"ID":"e9922289-1f8b-49be-a360-7cdd62a8039a","Type":"ContainerDied","Data":"4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b"} Oct 06 14:03:36 crc kubenswrapper[4892]: I1006 14:03:36.141261 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:03:37 crc kubenswrapper[4892]: I1006 14:03:37.172611 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cgh4" event={"ID":"e9922289-1f8b-49be-a360-7cdd62a8039a","Type":"ContainerStarted","Data":"78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928"} Oct 06 14:03:37 crc kubenswrapper[4892]: I1006 14:03:37.194170 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8cgh4" podStartSLOduration=2.484335559 podStartE2EDuration="8.194153729s" podCreationTimestamp="2025-10-06 14:03:29 +0000 UTC" firstStartedPulling="2025-10-06 14:03:31.084823853 +0000 UTC m=+6897.634529618" lastFinishedPulling="2025-10-06 14:03:36.794642023 +0000 UTC m=+6903.344347788" observedRunningTime="2025-10-06 14:03:37.192247084 +0000 UTC m=+6903.741952849" watchObservedRunningTime="2025-10-06 14:03:37.194153729 +0000 UTC m=+6903.743859494" Oct 06 14:03:40 crc kubenswrapper[4892]: I1006 14:03:40.037134 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:40 crc kubenswrapper[4892]: I1006 14:03:40.037712 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:41 crc kubenswrapper[4892]: I1006 14:03:41.103402 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8cgh4" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="registry-server" probeResult="failure" output=< Oct 06 14:03:41 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Oct 06 14:03:41 crc kubenswrapper[4892]: > Oct 06 14:03:43 crc kubenswrapper[4892]: I1006 14:03:43.168807 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:03:43 crc kubenswrapper[4892]: E1006 14:03:43.169719 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:03:50 crc kubenswrapper[4892]: I1006 14:03:50.087691 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:50 crc kubenswrapper[4892]: I1006 14:03:50.138613 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:50 crc kubenswrapper[4892]: I1006 14:03:50.324902 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cgh4"] Oct 06 14:03:51 crc kubenswrapper[4892]: I1006 14:03:51.311815 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8cgh4" 
podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="registry-server" containerID="cri-o://78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928" gracePeriod=2 Oct 06 14:03:51 crc kubenswrapper[4892]: I1006 14:03:51.804542 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:51 crc kubenswrapper[4892]: I1006 14:03:51.975463 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkchd\" (UniqueName: \"kubernetes.io/projected/e9922289-1f8b-49be-a360-7cdd62a8039a-kube-api-access-zkchd\") pod \"e9922289-1f8b-49be-a360-7cdd62a8039a\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " Oct 06 14:03:51 crc kubenswrapper[4892]: I1006 14:03:51.975694 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-utilities\") pod \"e9922289-1f8b-49be-a360-7cdd62a8039a\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " Oct 06 14:03:51 crc kubenswrapper[4892]: I1006 14:03:51.975744 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-catalog-content\") pod \"e9922289-1f8b-49be-a360-7cdd62a8039a\" (UID: \"e9922289-1f8b-49be-a360-7cdd62a8039a\") " Oct 06 14:03:51 crc kubenswrapper[4892]: I1006 14:03:51.976533 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-utilities" (OuterVolumeSpecName: "utilities") pod "e9922289-1f8b-49be-a360-7cdd62a8039a" (UID: "e9922289-1f8b-49be-a360-7cdd62a8039a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:03:51 crc kubenswrapper[4892]: I1006 14:03:51.982528 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9922289-1f8b-49be-a360-7cdd62a8039a-kube-api-access-zkchd" (OuterVolumeSpecName: "kube-api-access-zkchd") pod "e9922289-1f8b-49be-a360-7cdd62a8039a" (UID: "e9922289-1f8b-49be-a360-7cdd62a8039a"). InnerVolumeSpecName "kube-api-access-zkchd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.026254 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9922289-1f8b-49be-a360-7cdd62a8039a" (UID: "e9922289-1f8b-49be-a360-7cdd62a8039a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.078044 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.078083 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9922289-1f8b-49be-a360-7cdd62a8039a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.078093 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkchd\" (UniqueName: \"kubernetes.io/projected/e9922289-1f8b-49be-a360-7cdd62a8039a-kube-api-access-zkchd\") on node \"crc\" DevicePath \"\"" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.326752 4892 generic.go:334] "Generic (PLEG): container finished" podID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerID="78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928" exitCode=0 Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.326818 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cgh4" event={"ID":"e9922289-1f8b-49be-a360-7cdd62a8039a","Type":"ContainerDied","Data":"78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928"} Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.326859 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cgh4" event={"ID":"e9922289-1f8b-49be-a360-7cdd62a8039a","Type":"ContainerDied","Data":"d3e8aba7f47266eb7e8e0965bcc765a40c29f77ceb5af6940a2b7eb31e017c18"} Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.326886 4892 scope.go:117] "RemoveContainer" containerID="78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.327067 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8cgh4" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.362720 4892 scope.go:117] "RemoveContainer" containerID="4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.372147 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cgh4"] Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.382538 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8cgh4"] Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.389541 4892 scope.go:117] "RemoveContainer" containerID="61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.475561 4892 scope.go:117] "RemoveContainer" containerID="78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928" Oct 06 14:03:52 crc kubenswrapper[4892]: E1006 14:03:52.476058 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928\": container with ID starting with 78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928 not found: ID does not exist" containerID="78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.476127 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928"} err="failed to get container status \"78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928\": rpc error: code = NotFound desc = could not find container \"78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928\": container with ID starting with 78bf0452c34bede78fedeb4c87812e27bf196bfa9a3a906e35566b6d4c13a928 not found: ID does not exist" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.476154 4892 scope.go:117] "RemoveContainer" containerID="4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b" Oct 06 14:03:52 crc kubenswrapper[4892]: E1006 14:03:52.476424 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b\": container with ID starting with 4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b not found: ID does not exist" containerID="4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.476449 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b"} err="failed to get container status \"4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b\": rpc error: code = NotFound desc = could not find container \"4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b\": container with ID starting with 4c741860b4ae8f935e6da4a5236ecf8619f3d94ef6ea757c80723d52243a8a0b not found: ID does not exist" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.476468 4892 scope.go:117] "RemoveContainer" containerID="61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020" Oct 06 14:03:52 crc kubenswrapper[4892]: E1006 14:03:52.476766 4892 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020\": container with ID starting with 61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020 not found: ID does not exist" containerID="61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020" Oct 06 14:03:52 crc kubenswrapper[4892]: I1006 14:03:52.476795 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020"} err="failed to get container status \"61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020\": rpc error: code = NotFound desc = could not find container \"61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020\": container with ID starting with 61d7b1c78f5293452690bb27cf4d206710a31a86b939295c958c4ace23682020 not found: ID does not exist" Oct 06 14:03:54 crc kubenswrapper[4892]: I1006 14:03:54.188868 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" path="/var/lib/kubelet/pods/e9922289-1f8b-49be-a360-7cdd62a8039a/volumes" Oct 06 14:03:54 crc kubenswrapper[4892]: I1006 14:03:54.839441 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-54cccc9b7d-psd8k_fd885701-3239-4926-8393-6e671e0f3b22/barbican-api/0.log" Oct 06 14:03:54 crc kubenswrapper[4892]: I1006 14:03:54.921718 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-54cccc9b7d-psd8k_fd885701-3239-4926-8393-6e671e0f3b22/barbican-api-log/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.082128 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f964b9fb4-v4lhn_eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a/barbican-keystone-listener/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.190103 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f964b9fb4-v4lhn_eb6b7c5b-dc29-4fe6-86ec-f24e9faf261a/barbican-keystone-listener-log/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.281849 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6984f8c567-2sx4r_72c02562-a51b-42c5-8797-fe351a4932f7/barbican-worker/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.379727 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6984f8c567-2sx4r_72c02562-a51b-42c5-8797-fe351a4932f7/barbican-worker-log/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.549018 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-d8cxv_de624448-d17e-48b7-a11b-bcbd70fa860f/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.832581 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085bae0e-dea7-4bb8-8cf9-3855eff9336c/ceilometer-notification-agent/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.874884 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085bae0e-dea7-4bb8-8cf9-3855eff9336c/ceilometer-central-agent/0.log" Oct 06 14:03:55 crc kubenswrapper[4892]: I1006 14:03:55.912916 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085bae0e-dea7-4bb8-8cf9-3855eff9336c/proxy-httpd/0.log" Oct 06 
14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.045678 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_085bae0e-dea7-4bb8-8cf9-3855eff9336c/sg-core/0.log" Oct 06 14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.175292 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:03:56 crc kubenswrapper[4892]: E1006 14:03:56.175648 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.330370 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f76a315-cedc-4ab9-9838-6a58823db3e2/cinder-api-log/0.log" Oct 06 14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.361835 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9f76a315-cedc-4ab9-9838-6a58823db3e2/cinder-api/0.log" Oct 06 14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.524256 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_22e3a220-0262-4414-a93c-0da5d9d8cce3/cinder-scheduler/0.log" Oct 06 14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.593417 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_22e3a220-0262-4414-a93c-0da5d9d8cce3/probe/0.log" Oct 06 14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.715769 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-sf6g5_f1cd3b99-5754-49c8-bd71-2cc2e37ab7ed/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:03:56 crc kubenswrapper[4892]: I1006 14:03:56.881658 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-bj8j8_6665486e-c1dd-4b2d-96d8-5dd9140dc21e/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: I1006 14:03:57.034945 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kstlq_57b29956-a9eb-4ca8-b130-a67dfdf2190b/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: I1006 14:03:57.210641 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7f784f866c-b2xx9_aa0aa236-82a9-4c3c-9cdb-49515c29093d/init/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: I1006 14:03:57.376944 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7f784f866c-b2xx9_aa0aa236-82a9-4c3c-9cdb-49515c29093d/init/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: I1006 14:03:57.527058 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7f784f866c-b2xx9_aa0aa236-82a9-4c3c-9cdb-49515c29093d/dnsmasq-dns/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: I1006 14:03:57.648193 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-lgwsl_14a48578-0a22-4ebc-b227-a5fa1ffca71a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: 
I1006 14:03:57.773864 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7995e823-ea1a-4dce-bd8b-693d5e835a10/glance-httpd/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: I1006 14:03:57.835260 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7995e823-ea1a-4dce-bd8b-693d5e835a10/glance-log/0.log" Oct 06 14:03:57 crc kubenswrapper[4892]: I1006 14:03:57.988079 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_972058aa-0e97-4041-96c6-41acbb31d3ce/glance-httpd/0.log" Oct 06 14:03:58 crc kubenswrapper[4892]: I1006 14:03:58.059062 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_972058aa-0e97-4041-96c6-41acbb31d3ce/glance-log/0.log" Oct 06 14:03:58 crc kubenswrapper[4892]: I1006 14:03:58.349837 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6c68c58656-gbbdd_f038239e-35e8-4409-a858-d7aad410f5fd/horizon/0.log" Oct 06 14:03:58 crc kubenswrapper[4892]: I1006 14:03:58.453538 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-zzphp_68286c2d-3cd6-43fd-943d-b2156e1253a5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:03:58 crc kubenswrapper[4892]: I1006 14:03:58.582918 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-jw9jh_c8b52544-e7f5-4cab-9b11-1bf028d07c61/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:03:58 crc kubenswrapper[4892]: I1006 14:03:58.976992 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329261-8fz25_5d32494b-495f-4f2b-bffd-e514f409a5fd/keystone-cron/0.log" Oct 06 14:03:59 crc kubenswrapper[4892]: I1006 14:03:59.169696 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29329321-qw7ls_4e85ae2a-90b8-44f1-9e60-38580ae44fb0/keystone-cron/0.log" Oct 06 14:03:59 crc kubenswrapper[4892]: I1006 14:03:59.189713 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6c68c58656-gbbdd_f038239e-35e8-4409-a858-d7aad410f5fd/horizon-log/0.log" Oct 06 14:03:59 crc kubenswrapper[4892]: I1006 14:03:59.375152 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5bd5b56c96-qhdwx_6493eabc-ee54-41dd-a9e4-dfa55fe71dd1/keystone-api/0.log" Oct 06 14:03:59 crc kubenswrapper[4892]: I1006 14:03:59.418075 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_fbf35d1d-7eb9-4c7d-8f54-278f490eaf4f/kube-state-metrics/0.log" Oct 06 14:03:59 crc kubenswrapper[4892]: I1006 14:03:59.576370 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6njgq_aa2cf17a-4ca9-414f-9421-46fa314679d0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:00 crc kubenswrapper[4892]: I1006 14:04:00.012222 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-696599f45c-ndwj5_7ee89f5c-ce92-4b17-82b5-7b8be64fa649/neutron-httpd/0.log" Oct 06 14:04:00 crc kubenswrapper[4892]: I1006 14:04:00.024914 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-696599f45c-ndwj5_7ee89f5c-ce92-4b17-82b5-7b8be64fa649/neutron-api/0.log" Oct 06 14:04:00 crc kubenswrapper[4892]: I1006 14:04:00.273529 4892 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-rvzpk_4341f2a6-b1f7-453a-84db-a4ba1888c381/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:00 crc kubenswrapper[4892]: I1006 14:04:00.884474 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_e8ca03eb-d66d-4149-9b53-aeefb4d1478c/nova-cell0-conductor-conductor/0.log" Oct 06 14:04:01 crc kubenswrapper[4892]: I1006 14:04:01.417881 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4d101022-48e0-4666-b28f-e4dd08f380ad/nova-cell1-conductor-conductor/0.log" Oct 06 14:04:01 crc kubenswrapper[4892]: I1006 14:04:01.791704 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b87347f7-4f43-42da-9a0d-59fca9e193c5/nova-api-log/0.log" Oct 06 14:04:02 crc kubenswrapper[4892]: I1006 14:04:02.026734 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_67c67192-e4bc-41c1-894c-692a7641934b/nova-cell1-novncproxy-novncproxy/0.log" Oct 06 14:04:02 crc kubenswrapper[4892]: I1006 14:04:02.147050 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b87347f7-4f43-42da-9a0d-59fca9e193c5/nova-api-api/0.log" Oct 06 14:04:02 crc kubenswrapper[4892]: I1006 14:04:02.459426 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-rjll6_06eb266c-79fd-49cd-9071-1ed4446e94d6/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:02 crc kubenswrapper[4892]: I1006 14:04:02.519245 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a5354ced-b54e-4a88-934b-7bebbacccf1e/nova-metadata-log/0.log" Oct 06 14:04:02 crc kubenswrapper[4892]: I1006 14:04:02.934005 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c1f22bc9-62b7-4c4e-b794-83a9f1a54f3c/nova-scheduler-scheduler/0.log" Oct 06 14:04:03 crc kubenswrapper[4892]: I1006 14:04:03.139637 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_19207559-7eb7-49b5-9b73-0641f426ab63/mysql-bootstrap/0.log" Oct 06 14:04:03 crc kubenswrapper[4892]: I1006 14:04:03.349532 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_19207559-7eb7-49b5-9b73-0641f426ab63/mysql-bootstrap/0.log" Oct 06 14:04:03 crc kubenswrapper[4892]: I1006 14:04:03.365668 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_19207559-7eb7-49b5-9b73-0641f426ab63/galera/0.log" Oct 06 14:04:03 crc kubenswrapper[4892]: I1006 14:04:03.607287 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_728515b5-40b3-48f4-8452-85ce84a9930a/mysql-bootstrap/0.log" Oct 06 14:04:03 crc kubenswrapper[4892]: I1006 14:04:03.817869 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_728515b5-40b3-48f4-8452-85ce84a9930a/mysql-bootstrap/0.log" Oct 06 14:04:03 crc kubenswrapper[4892]: I1006 14:04:03.834951 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_728515b5-40b3-48f4-8452-85ce84a9930a/galera/0.log" Oct 06 14:04:04 crc kubenswrapper[4892]: I1006 14:04:04.059487 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_88899d7e-63bd-4092-a2e5-81974383d714/openstackclient/0.log" Oct 06 14:04:04 crc 
kubenswrapper[4892]: I1006 14:04:04.261658 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-l6mw2_cc5ba4f2-f4c1-46af-843e-6d3d4e8556e8/ovn-controller/0.log" Oct 06 14:04:04 crc kubenswrapper[4892]: I1006 14:04:04.483484 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xh7kj_ac7b9745-1f4f-4a0f-8803-4ecd222fd160/openstack-network-exporter/0.log" Oct 06 14:04:04 crc kubenswrapper[4892]: I1006 14:04:04.689170 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qtgzv_20f13b95-c224-4a4d-acd3-ad229e3223fb/ovsdb-server-init/0.log" Oct 06 14:04:04 crc kubenswrapper[4892]: I1006 14:04:04.919672 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qtgzv_20f13b95-c224-4a4d-acd3-ad229e3223fb/ovsdb-server-init/0.log" Oct 06 14:04:05 crc kubenswrapper[4892]: I1006 14:04:05.131280 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qtgzv_20f13b95-c224-4a4d-acd3-ad229e3223fb/ovsdb-server/0.log" Oct 06 14:04:05 crc kubenswrapper[4892]: I1006 14:04:05.206744 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qtgzv_20f13b95-c224-4a4d-acd3-ad229e3223fb/ovs-vswitchd/0.log" Oct 06 14:04:05 crc kubenswrapper[4892]: I1006 14:04:05.454164 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-zwdjq_073e303e-602b-4cce-b0c5-f9da295a63a4/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:05 crc kubenswrapper[4892]: I1006 14:04:05.639535 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a5354ced-b54e-4a88-934b-7bebbacccf1e/nova-metadata-metadata/0.log" Oct 06 14:04:05 crc kubenswrapper[4892]: I1006 14:04:05.687372 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d204094-c3c0-4f10-8668-731e258b54f6/openstack-network-exporter/0.log" Oct 06 14:04:05 crc kubenswrapper[4892]: I1006 14:04:05.813839 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_1d204094-c3c0-4f10-8668-731e258b54f6/ovn-northd/0.log" Oct 06 14:04:05 crc kubenswrapper[4892]: I1006 14:04:05.879386 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_516888f8-ccb5-4bbc-b11d-c09a8dbd19b1/openstack-network-exporter/0.log" Oct 06 14:04:06 crc kubenswrapper[4892]: I1006 14:04:06.039489 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_516888f8-ccb5-4bbc-b11d-c09a8dbd19b1/ovsdbserver-nb/0.log" Oct 06 14:04:06 crc kubenswrapper[4892]: I1006 14:04:06.212051 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ae4b1e93-1c14-436a-84e3-7d9359228563/openstack-network-exporter/0.log" Oct 06 14:04:06 crc kubenswrapper[4892]: I1006 14:04:06.331372 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ae4b1e93-1c14-436a-84e3-7d9359228563/ovsdbserver-sb/0.log" Oct 06 14:04:06 crc kubenswrapper[4892]: I1006 14:04:06.788129 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-586dc66ccd-lmkkz_e20d1322-c08c-4173-95f0-146b3c2cce04/placement-api/0.log" Oct 06 14:04:06 crc kubenswrapper[4892]: I1006 14:04:06.839613 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-586dc66ccd-lmkkz_e20d1322-c08c-4173-95f0-146b3c2cce04/placement-log/0.log" 
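
The long run of "Finished parsing log file" entries is the kubelet's log handler walking per-pod directories under /var/log/pods, where each container gets its own subdirectory of <namespace>_<pod>_<uid>/ and rotated files named 0.log, 1.log, and so on. A small sketch that enumerates the same files; this assumes only read access to /var/log/pods and the path layout visible in the entries above:

  package main

  import (
      "fmt"
      "io/fs"
      "os"
      "path/filepath"
      "strings"
  )

  func main() {
      root := "/var/log/pods"
      err := filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
          if err != nil {
              return err
          }
          // Layout seen in the log: <ns>_<pod>_<uid>/<container>/<n>.log,
          // where <n> is the rotation index (0.log for the current file).
          if !d.IsDir() && strings.HasSuffix(path, ".log") {
              fmt.Println(path)
          }
          return nil
      })
      if err != nil {
          fmt.Fprintln(os.Stderr, err)
      }
  }
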
Oct 06 14:04:06 crc kubenswrapper[4892]: I1006 14:04:06.941777 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f636c8ba-cc7f-420c-8847-ad1ecf766974/init-config-reloader/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.141809 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f636c8ba-cc7f-420c-8847-ad1ecf766974/config-reloader/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.143416 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f636c8ba-cc7f-420c-8847-ad1ecf766974/init-config-reloader/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.170224 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:04:07 crc kubenswrapper[4892]: E1006 14:04:07.170763 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.192574 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f636c8ba-cc7f-420c-8847-ad1ecf766974/prometheus/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.449821 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f636c8ba-cc7f-420c-8847-ad1ecf766974/thanos-sidecar/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.543908 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1ea9e651-a19b-445b-96dc-fd25c0df95f2/setup-container/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.743675 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1ea9e651-a19b-445b-96dc-fd25c0df95f2/setup-container/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.805583 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_1ea9e651-a19b-445b-96dc-fd25c0df95f2/rabbitmq/0.log" Oct 06 14:04:07 crc kubenswrapper[4892]: I1006 14:04:07.973046 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_000efd26-a8c0-4668-9603-9ee7a9aed0ed/setup-container/0.log" Oct 06 14:04:08 crc kubenswrapper[4892]: I1006 14:04:08.206137 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_000efd26-a8c0-4668-9603-9ee7a9aed0ed/setup-container/0.log" Oct 06 14:04:08 crc kubenswrapper[4892]: I1006 14:04:08.280645 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_000efd26-a8c0-4668-9603-9ee7a9aed0ed/rabbitmq/0.log" Oct 06 14:04:08 crc kubenswrapper[4892]: I1006 14:04:08.426589 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b11ee41a-0493-4955-b081-d78b83730ec4/setup-container/0.log" Oct 06 14:04:08 crc kubenswrapper[4892]: I1006 14:04:08.621431 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_b11ee41a-0493-4955-b081-d78b83730ec4/setup-container/0.log" Oct 06 14:04:08 crc kubenswrapper[4892]: I1006 14:04:08.682894 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b11ee41a-0493-4955-b081-d78b83730ec4/rabbitmq/0.log" Oct 06 14:04:08 crc kubenswrapper[4892]: I1006 14:04:08.819308 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-rjr2c_7ae373d0-3872-4bdb-ab95-af7aba741dbc/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:09 crc kubenswrapper[4892]: I1006 14:04:09.054611 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wrdgl_4f608c5b-99de-42b3-83c7-9a514aa5e54b/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:09 crc kubenswrapper[4892]: I1006 14:04:09.159866 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hc2w4_9b014a50-d437-4fd0-9d31-aff86fbf851c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:09 crc kubenswrapper[4892]: I1006 14:04:09.366038 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fksz8_ecf9ad4c-db82-467f-9fae-bc74b2e7c912/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:09 crc kubenswrapper[4892]: I1006 14:04:09.647127 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-b7wmh_cac6d064-9f38-40e2-aa4a-ad2af08245c3/ssh-known-hosts-edpm-deployment/0.log" Oct 06 14:04:09 crc kubenswrapper[4892]: I1006 14:04:09.945949 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57bbd8d677-4mwpb_9b16ec0c-fdde-42a8-9a45-da67ecd56360/proxy-server/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.083521 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-57bbd8d677-4mwpb_9b16ec0c-fdde-42a8-9a45-da67ecd56360/proxy-httpd/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.186362 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-xjptv_a045df0b-e5d4-4e68-b29f-47e270efa265/swift-ring-rebalance/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.332725 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/account-auditor/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.407504 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/account-reaper/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.431756 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/account-replicator/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.591084 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/account-server/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.640045 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/container-auditor/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.720535 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/container-replicator/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.808956 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/container-server/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.840002 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/container-updater/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.971186 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/object-auditor/0.log" Oct 06 14:04:10 crc kubenswrapper[4892]: I1006 14:04:10.997934 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/object-expirer/0.log" Oct 06 14:04:11 crc kubenswrapper[4892]: I1006 14:04:11.104406 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/object-replicator/0.log" Oct 06 14:04:11 crc kubenswrapper[4892]: I1006 14:04:11.167596 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/object-server/0.log" Oct 06 14:04:11 crc kubenswrapper[4892]: I1006 14:04:11.229656 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/object-updater/0.log" Oct 06 14:04:11 crc kubenswrapper[4892]: I1006 14:04:11.342094 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/rsync/0.log" Oct 06 14:04:11 crc kubenswrapper[4892]: I1006 14:04:11.441782 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_9f90d8be-05b7-4668-be7c-1494621a363b/swift-recon-cron/0.log" Oct 06 14:04:11 crc kubenswrapper[4892]: I1006 14:04:11.625530 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-ktp7f_ab077a9a-134b-497a-abce-777fb1303160/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:12 crc kubenswrapper[4892]: I1006 14:04:12.003217 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_2443f61e-fb23-4eeb-9e36-7ee51d31b322/tempest-tests-tempest-tests-runner/0.log" Oct 06 14:04:12 crc kubenswrapper[4892]: I1006 14:04:12.006398 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-j5t4n_392cffb3-245d-4f4a-86eb-81e59a488996/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 06 14:04:13 crc kubenswrapper[4892]: I1006 14:04:13.223806 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_10c6d4d0-3a47-4756-a5bb-65ff70e4c677/watcher-applier/0.log" Oct 06 14:04:14 crc kubenswrapper[4892]: I1006 14:04:14.288728 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_230bad23-f208-488d-90ae-dcdf6c56fa64/watcher-api-log/0.log" Oct 06 14:04:18 crc kubenswrapper[4892]: I1006 14:04:18.860026 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_8aa53cf5-94c5-483e-9ef0-bf823a8abff7/watcher-decision-engine/0.log" Oct 06 14:04:19 crc kubenswrapper[4892]: I1006 14:04:19.680357 4892 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_230bad23-f208-488d-90ae-dcdf6c56fa64/watcher-api/0.log" Oct 06 14:04:22 crc kubenswrapper[4892]: I1006 14:04:22.171373 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:04:22 crc kubenswrapper[4892]: E1006 14:04:22.172106 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:04:22 crc kubenswrapper[4892]: I1006 14:04:22.912989 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_82b202ad-f5d6-406b-9821-3a4a18c795eb/memcached/0.log" Oct 06 14:04:35 crc kubenswrapper[4892]: I1006 14:04:35.168649 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:04:35 crc kubenswrapper[4892]: E1006 14:04:35.169708 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:04:47 crc kubenswrapper[4892]: I1006 14:04:47.169179 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:04:47 crc kubenswrapper[4892]: E1006 14:04:47.170219 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:04:59 crc kubenswrapper[4892]: I1006 14:04:59.169787 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:04:59 crc kubenswrapper[4892]: E1006 14:04:59.170921 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:05:04 crc kubenswrapper[4892]: I1006 14:05:04.083974 4892 generic.go:334] "Generic (PLEG): container finished" podID="24f5854d-3bfb-4184-8aeb-e413a3812889" containerID="b686b1df9f7a3104fb59640b473aab6d49948982d1bed01e1babc271e1552a5c" exitCode=0 Oct 06 14:05:04 crc kubenswrapper[4892]: I1006 14:05:04.084402 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" 
event={"ID":"24f5854d-3bfb-4184-8aeb-e413a3812889","Type":"ContainerDied","Data":"b686b1df9f7a3104fb59640b473aab6d49948982d1bed01e1babc271e1552a5c"} Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.217562 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.251499 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-7knrj"] Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.259245 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-7knrj"] Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.368856 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24f5854d-3bfb-4184-8aeb-e413a3812889-host\") pod \"24f5854d-3bfb-4184-8aeb-e413a3812889\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.368932 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djfgx\" (UniqueName: \"kubernetes.io/projected/24f5854d-3bfb-4184-8aeb-e413a3812889-kube-api-access-djfgx\") pod \"24f5854d-3bfb-4184-8aeb-e413a3812889\" (UID: \"24f5854d-3bfb-4184-8aeb-e413a3812889\") " Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.369297 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24f5854d-3bfb-4184-8aeb-e413a3812889-host" (OuterVolumeSpecName: "host") pod "24f5854d-3bfb-4184-8aeb-e413a3812889" (UID: "24f5854d-3bfb-4184-8aeb-e413a3812889"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.374541 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f5854d-3bfb-4184-8aeb-e413a3812889-kube-api-access-djfgx" (OuterVolumeSpecName: "kube-api-access-djfgx") pod "24f5854d-3bfb-4184-8aeb-e413a3812889" (UID: "24f5854d-3bfb-4184-8aeb-e413a3812889"). InnerVolumeSpecName "kube-api-access-djfgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.471557 4892 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/24f5854d-3bfb-4184-8aeb-e413a3812889-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:05 crc kubenswrapper[4892]: I1006 14:05:05.471810 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djfgx\" (UniqueName: \"kubernetes.io/projected/24f5854d-3bfb-4184-8aeb-e413a3812889-kube-api-access-djfgx\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.110506 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="248b0440d9ccaba0420d4fb8b8013be17662a1db1b307e605cb83ad2d974cb25" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.110587 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-7knrj" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.182657 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f5854d-3bfb-4184-8aeb-e413a3812889" path="/var/lib/kubelet/pods/24f5854d-3bfb-4184-8aeb-e413a3812889/volumes" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.435171 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-ksjts"] Oct 06 14:05:06 crc kubenswrapper[4892]: E1006 14:05:06.435808 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="extract-utilities" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.435837 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="extract-utilities" Oct 06 14:05:06 crc kubenswrapper[4892]: E1006 14:05:06.435891 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="registry-server" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.435905 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="registry-server" Oct 06 14:05:06 crc kubenswrapper[4892]: E1006 14:05:06.435938 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="extract-content" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.435955 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="extract-content" Oct 06 14:05:06 crc kubenswrapper[4892]: E1006 14:05:06.435990 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f5854d-3bfb-4184-8aeb-e413a3812889" containerName="container-00" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.436007 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f5854d-3bfb-4184-8aeb-e413a3812889" containerName="container-00" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.436391 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f5854d-3bfb-4184-8aeb-e413a3812889" containerName="container-00" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.436449 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9922289-1f8b-49be-a360-7cdd62a8039a" containerName="registry-server" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.437603 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.598389 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-574k9\" (UniqueName: \"kubernetes.io/projected/0d50e544-71e3-4c86-84d3-ae0bfa14140e-kube-api-access-574k9\") pod \"crc-debug-ksjts\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.598463 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d50e544-71e3-4c86-84d3-ae0bfa14140e-host\") pod \"crc-debug-ksjts\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.701241 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-574k9\" (UniqueName: \"kubernetes.io/projected/0d50e544-71e3-4c86-84d3-ae0bfa14140e-kube-api-access-574k9\") pod \"crc-debug-ksjts\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.701314 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d50e544-71e3-4c86-84d3-ae0bfa14140e-host\") pod \"crc-debug-ksjts\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.701600 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d50e544-71e3-4c86-84d3-ae0bfa14140e-host\") pod \"crc-debug-ksjts\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.737098 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-574k9\" (UniqueName: \"kubernetes.io/projected/0d50e544-71e3-4c86-84d3-ae0bfa14140e-kube-api-access-574k9\") pod \"crc-debug-ksjts\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:06 crc kubenswrapper[4892]: I1006 14:05:06.765680 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:07 crc kubenswrapper[4892]: I1006 14:05:07.120041 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" event={"ID":"0d50e544-71e3-4c86-84d3-ae0bfa14140e","Type":"ContainerStarted","Data":"7075a2006bd0af331be0c83923506fc2b0cc905a2f0c0a5b907b8306d6a23425"} Oct 06 14:05:07 crc kubenswrapper[4892]: I1006 14:05:07.120479 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" event={"ID":"0d50e544-71e3-4c86-84d3-ae0bfa14140e","Type":"ContainerStarted","Data":"a056a4a2db00197b009ca478296f31f9109d5184fa800b384bddef42828b8eaf"} Oct 06 14:05:07 crc kubenswrapper[4892]: I1006 14:05:07.140484 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" podStartSLOduration=1.140466861 podStartE2EDuration="1.140466861s" podCreationTimestamp="2025-10-06 14:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-06 14:05:07.129536055 +0000 UTC m=+6993.679241820" watchObservedRunningTime="2025-10-06 14:05:07.140466861 +0000 UTC m=+6993.690172626" Oct 06 14:05:08 crc kubenswrapper[4892]: I1006 14:05:08.134448 4892 generic.go:334] "Generic (PLEG): container finished" podID="0d50e544-71e3-4c86-84d3-ae0bfa14140e" containerID="7075a2006bd0af331be0c83923506fc2b0cc905a2f0c0a5b907b8306d6a23425" exitCode=0 Oct 06 14:05:08 crc kubenswrapper[4892]: I1006 14:05:08.134491 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" event={"ID":"0d50e544-71e3-4c86-84d3-ae0bfa14140e","Type":"ContainerDied","Data":"7075a2006bd0af331be0c83923506fc2b0cc905a2f0c0a5b907b8306d6a23425"} Oct 06 14:05:09 crc kubenswrapper[4892]: I1006 14:05:09.260291 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:09 crc kubenswrapper[4892]: I1006 14:05:09.351115 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d50e544-71e3-4c86-84d3-ae0bfa14140e-host\") pod \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " Oct 06 14:05:09 crc kubenswrapper[4892]: I1006 14:05:09.351703 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-574k9\" (UniqueName: \"kubernetes.io/projected/0d50e544-71e3-4c86-84d3-ae0bfa14140e-kube-api-access-574k9\") pod \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\" (UID: \"0d50e544-71e3-4c86-84d3-ae0bfa14140e\") " Oct 06 14:05:09 crc kubenswrapper[4892]: I1006 14:05:09.353086 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d50e544-71e3-4c86-84d3-ae0bfa14140e-host" (OuterVolumeSpecName: "host") pod "0d50e544-71e3-4c86-84d3-ae0bfa14140e" (UID: "0d50e544-71e3-4c86-84d3-ae0bfa14140e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:05:09 crc kubenswrapper[4892]: I1006 14:05:09.355481 4892 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d50e544-71e3-4c86-84d3-ae0bfa14140e-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:09 crc kubenswrapper[4892]: I1006 14:05:09.362724 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d50e544-71e3-4c86-84d3-ae0bfa14140e-kube-api-access-574k9" (OuterVolumeSpecName: "kube-api-access-574k9") pod "0d50e544-71e3-4c86-84d3-ae0bfa14140e" (UID: "0d50e544-71e3-4c86-84d3-ae0bfa14140e"). InnerVolumeSpecName "kube-api-access-574k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:05:09 crc kubenswrapper[4892]: I1006 14:05:09.457080 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-574k9\" (UniqueName: \"kubernetes.io/projected/0d50e544-71e3-4c86-84d3-ae0bfa14140e-kube-api-access-574k9\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:10 crc kubenswrapper[4892]: I1006 14:05:10.152041 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" event={"ID":"0d50e544-71e3-4c86-84d3-ae0bfa14140e","Type":"ContainerDied","Data":"a056a4a2db00197b009ca478296f31f9109d5184fa800b384bddef42828b8eaf"} Oct 06 14:05:10 crc kubenswrapper[4892]: I1006 14:05:10.152336 4892 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a056a4a2db00197b009ca478296f31f9109d5184fa800b384bddef42828b8eaf" Oct 06 14:05:10 crc kubenswrapper[4892]: I1006 14:05:10.152089 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-ksjts" Oct 06 14:05:10 crc kubenswrapper[4892]: I1006 14:05:10.168929 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:05:10 crc kubenswrapper[4892]: E1006 14:05:10.169253 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:05:17 crc kubenswrapper[4892]: I1006 14:05:17.020611 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-ksjts"] Oct 06 14:05:17 crc kubenswrapper[4892]: I1006 14:05:17.028745 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-ksjts"] Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.194986 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d50e544-71e3-4c86-84d3-ae0bfa14140e" path="/var/lib/kubelet/pods/0d50e544-71e3-4c86-84d3-ae0bfa14140e/volumes" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.240487 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-smql2"] Oct 06 14:05:18 crc kubenswrapper[4892]: E1006 14:05:18.240894 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d50e544-71e3-4c86-84d3-ae0bfa14140e" containerName="container-00" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.240909 4892 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0d50e544-71e3-4c86-84d3-ae0bfa14140e" containerName="container-00" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.241101 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d50e544-71e3-4c86-84d3-ae0bfa14140e" containerName="container-00" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.241782 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.412587 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd6f012a-12a5-48ba-95b8-9bd09cb71050-host\") pod \"crc-debug-smql2\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.413235 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxqkm\" (UniqueName: \"kubernetes.io/projected/bd6f012a-12a5-48ba-95b8-9bd09cb71050-kube-api-access-nxqkm\") pod \"crc-debug-smql2\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.514699 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd6f012a-12a5-48ba-95b8-9bd09cb71050-host\") pod \"crc-debug-smql2\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.514794 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd6f012a-12a5-48ba-95b8-9bd09cb71050-host\") pod \"crc-debug-smql2\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.515256 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxqkm\" (UniqueName: \"kubernetes.io/projected/bd6f012a-12a5-48ba-95b8-9bd09cb71050-kube-api-access-nxqkm\") pod \"crc-debug-smql2\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.540728 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxqkm\" (UniqueName: \"kubernetes.io/projected/bd6f012a-12a5-48ba-95b8-9bd09cb71050-kube-api-access-nxqkm\") pod \"crc-debug-smql2\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:18 crc kubenswrapper[4892]: I1006 14:05:18.562312 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:19 crc kubenswrapper[4892]: I1006 14:05:19.242524 4892 generic.go:334] "Generic (PLEG): container finished" podID="bd6f012a-12a5-48ba-95b8-9bd09cb71050" containerID="8b32b5ebf8c9a94199b650a51c3243e3d6a28f88a2a31d70a044323e6a383e81" exitCode=0 Oct 06 14:05:19 crc kubenswrapper[4892]: I1006 14:05:19.242606 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-smql2" event={"ID":"bd6f012a-12a5-48ba-95b8-9bd09cb71050","Type":"ContainerDied","Data":"8b32b5ebf8c9a94199b650a51c3243e3d6a28f88a2a31d70a044323e6a383e81"} Oct 06 14:05:19 crc kubenswrapper[4892]: I1006 14:05:19.242778 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/crc-debug-smql2" event={"ID":"bd6f012a-12a5-48ba-95b8-9bd09cb71050","Type":"ContainerStarted","Data":"ecd1d41581a3c7ca9376131179560f48351825ffd9e6cbab99b498c7bcccdf63"} Oct 06 14:05:19 crc kubenswrapper[4892]: I1006 14:05:19.283090 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-smql2"] Oct 06 14:05:19 crc kubenswrapper[4892]: I1006 14:05:19.291216 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ktz9n/crc-debug-smql2"] Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.358713 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.450579 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxqkm\" (UniqueName: \"kubernetes.io/projected/bd6f012a-12a5-48ba-95b8-9bd09cb71050-kube-api-access-nxqkm\") pod \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.450938 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd6f012a-12a5-48ba-95b8-9bd09cb71050-host\") pod \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\" (UID: \"bd6f012a-12a5-48ba-95b8-9bd09cb71050\") " Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.451052 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd6f012a-12a5-48ba-95b8-9bd09cb71050-host" (OuterVolumeSpecName: "host") pod "bd6f012a-12a5-48ba-95b8-9bd09cb71050" (UID: "bd6f012a-12a5-48ba-95b8-9bd09cb71050"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.451423 4892 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd6f012a-12a5-48ba-95b8-9bd09cb71050-host\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.457940 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd6f012a-12a5-48ba-95b8-9bd09cb71050-kube-api-access-nxqkm" (OuterVolumeSpecName: "kube-api-access-nxqkm") pod "bd6f012a-12a5-48ba-95b8-9bd09cb71050" (UID: "bd6f012a-12a5-48ba-95b8-9bd09cb71050"). InnerVolumeSpecName "kube-api-access-nxqkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.553198 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxqkm\" (UniqueName: \"kubernetes.io/projected/bd6f012a-12a5-48ba-95b8-9bd09cb71050-kube-api-access-nxqkm\") on node \"crc\" DevicePath \"\"" Oct 06 14:05:20 crc kubenswrapper[4892]: I1006 14:05:20.948166 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f_2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1/util/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.120853 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f_2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1/pull/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.128451 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f_2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1/util/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.139986 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f_2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1/pull/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.263278 4892 scope.go:117] "RemoveContainer" containerID="8b32b5ebf8c9a94199b650a51c3243e3d6a28f88a2a31d70a044323e6a383e81" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.263421 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/crc-debug-smql2" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.369588 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f_2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1/util/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.375111 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f_2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1/pull/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.404229 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b7eb4e80eecc838b1b068308d345e854e29e38e03fad443c9a36677aafnsg5f_2ee60ce0-cf65-4875-a1e5-fc3c4b3542d1/extract/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.516291 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-b6gs7_4c2bced1-04b0-4525-b1d3-c3adc3669b68/kube-rbac-proxy/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.631076 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-b6gs7_4c2bced1-04b0-4525-b1d3-c3adc3669b68/manager/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.713255 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-p2ncw_1de023aa-81a7-4510-a6a3-93010ca572be/kube-rbac-proxy/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.779400 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-p2ncw_1de023aa-81a7-4510-a6a3-93010ca572be/manager/0.log" 
Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.849893 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-68ldg_718cc1ac-8554-450f-bca5-5449909339dd/kube-rbac-proxy/0.log" Oct 06 14:05:21 crc kubenswrapper[4892]: I1006 14:05:21.909562 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-68ldg_718cc1ac-8554-450f-bca5-5449909339dd/manager/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.029053 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-gtrv6_773fe79a-1318-46ee-87bc-99786396705c/kube-rbac-proxy/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.117073 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-gtrv6_773fe79a-1318-46ee-87bc-99786396705c/manager/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.147657 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-s8hw9_ed10a250-1c0b-4fc4-9906-6e01dba78a1e/kube-rbac-proxy/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.169429 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:05:22 crc kubenswrapper[4892]: E1006 14:05:22.169701 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.181731 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6f012a-12a5-48ba-95b8-9bd09cb71050" path="/var/lib/kubelet/pods/bd6f012a-12a5-48ba-95b8-9bd09cb71050/volumes" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.277874 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-s8hw9_ed10a250-1c0b-4fc4-9906-6e01dba78a1e/manager/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.347583 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-qdxqw_3b478830-010a-408c-9eb4-0eaa51f75c31/manager/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.357992 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-qdxqw_3b478830-010a-408c-9eb4-0eaa51f75c31/kube-rbac-proxy/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.521596 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-bdx4w_5c85c0be-894b-4469-820c-35cac2b32905/kube-rbac-proxy/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.675028 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-h6gp4_8bba42dc-f728-4066-a83c-632c7dfd4502/kube-rbac-proxy/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.693777 4892 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-bdx4w_5c85c0be-894b-4469-820c-35cac2b32905/manager/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.720154 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-h6gp4_8bba42dc-f728-4066-a83c-632c7dfd4502/manager/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.858540 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-s8527_8031228c-d653-49ea-aa71-90709d299152/kube-rbac-proxy/0.log" Oct 06 14:05:22 crc kubenswrapper[4892]: I1006 14:05:22.918584 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-s8527_8031228c-d653-49ea-aa71-90709d299152/manager/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.013845 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-cfcl2_7652cece-84f1-49ba-b99b-8e40047e7822/kube-rbac-proxy/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.037452 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-cfcl2_7652cece-84f1-49ba-b99b-8e40047e7822/manager/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.115869 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d_c71106dc-615a-4668-8485-8140171bd46e/kube-rbac-proxy/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.214424 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-qdj6d_c71106dc-615a-4668-8485-8140171bd46e/manager/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.275192 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-ls78t_d23bd511-3627-436d-bea3-abc434a7ecc7/kube-rbac-proxy/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.342549 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-ls78t_d23bd511-3627-436d-bea3-abc434a7ecc7/manager/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.411150 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-6ws6z_23c2345a-b95d-40e2-9d36-4affd79498e8/kube-rbac-proxy/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.533197 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-6ws6z_23c2345a-b95d-40e2-9d36-4affd79498e8/manager/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.602816 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-lvcfv_7e7d79cb-0957-4ef5-9b20-f82fd5d288d8/kube-rbac-proxy/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.644998 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-lvcfv_7e7d79cb-0957-4ef5-9b20-f82fd5d288d8/manager/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 
14:05:23.741306 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k_1988af8e-3dcc-41b2-a044-03af6e6bc040/kube-rbac-proxy/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.787915 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665c9fg5k_1988af8e-3dcc-41b2-a044-03af6e6bc040/manager/0.log" Oct 06 14:05:23 crc kubenswrapper[4892]: I1006 14:05:23.934242 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c6b9976b-dv6c2_dd70d284-2387-44ef-ad9c-eb725a2a283d/kube-rbac-proxy/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.136846 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-c85944558-7trks_883f9aa6-2f29-4f1b-b2f5-0581ab6853f9/kube-rbac-proxy/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.305914 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-c85944558-7trks_883f9aa6-2f29-4f1b-b2f5-0581ab6853f9/operator/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.345741 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-kdnq4_21f6370c-bc30-4e85-866b-d15376b5d6c8/registry-server/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.559419 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-7jqz4_94df4eb7-c559-4ce8-9deb-5da3a04bebb7/kube-rbac-proxy/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.640081 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-7jqz4_94df4eb7-c559-4ce8-9deb-5da3a04bebb7/manager/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.744660 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-q7kbv_32f9f85f-cb86-41f3-88c1-d891f5e67608/kube-rbac-proxy/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.809159 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-q7kbv_32f9f85f-cb86-41f3-88c1-d891f5e67608/manager/0.log" Oct 06 14:05:24 crc kubenswrapper[4892]: I1006 14:05:24.905766 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-s5vkk_2f1e9464-e868-4fb8-baee-7832454f4cd5/operator/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.074050 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-5gfhv_6a4055ea-2ed1-4de3-a975-850404b8d746/kube-rbac-proxy/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.143101 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-5gfhv_6a4055ea-2ed1-4de3-a975-850404b8d746/manager/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.180104 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c6b9976b-dv6c2_dd70d284-2387-44ef-ad9c-eb725a2a283d/manager/0.log" Oct 06 14:05:25 crc 
kubenswrapper[4892]: I1006 14:05:25.236399 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-wht6n_304bfead-26e9-4a95-9c1d-8659de9b0546/kube-rbac-proxy/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.351828 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-z6qpz_e81e38e4-cf8a-4a93-8753-335c6c1aca6c/kube-rbac-proxy/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.386847 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-z6qpz_e81e38e4-cf8a-4a93-8753-335c6c1aca6c/manager/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.494403 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-wht6n_304bfead-26e9-4a95-9c1d-8659de9b0546/manager/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.565650 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5966748665-mwjwq_41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a/kube-rbac-proxy/0.log" Oct 06 14:05:25 crc kubenswrapper[4892]: I1006 14:05:25.654289 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5966748665-mwjwq_41ce0f01-b6e5-44b2-ae7a-28ea53b88e5a/manager/0.log" Oct 06 14:05:36 crc kubenswrapper[4892]: I1006 14:05:36.168486 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:05:36 crc kubenswrapper[4892]: E1006 14:05:36.169286 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:05:40 crc kubenswrapper[4892]: I1006 14:05:40.403900 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4bs2h_ab463181-efdc-4a78-b735-176516f4d185/control-plane-machine-set-operator/0.log" Oct 06 14:05:40 crc kubenswrapper[4892]: I1006 14:05:40.512092 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-knk7q_099dffdf-1bf9-451d-8248-d4104dcdf1b6/kube-rbac-proxy/0.log" Oct 06 14:05:40 crc kubenswrapper[4892]: I1006 14:05:40.566529 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-knk7q_099dffdf-1bf9-451d-8248-d4104dcdf1b6/machine-api-operator/0.log" Oct 06 14:05:48 crc kubenswrapper[4892]: I1006 14:05:48.169157 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:05:48 crc kubenswrapper[4892]: E1006 14:05:48.169991 4892 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4t26s_openshift-machine-config-operator(f0107ee8-a9e2-4a14-b044-1c37a9df4d38)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" Oct 06 14:05:52 crc kubenswrapper[4892]: I1006 14:05:52.146437 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-86dnp_f92d7a66-86e9-4f49-9797-d0714a72e329/cert-manager-controller/0.log" Oct 06 14:05:52 crc kubenswrapper[4892]: I1006 14:05:52.311421 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-hz75l_248acf10-be69-4f77-8101-d6e3f8a454d6/cert-manager-cainjector/0.log" Oct 06 14:05:52 crc kubenswrapper[4892]: I1006 14:05:52.352656 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-h8rv5_8f1a8124-54fd-486e-90d9-dbe21bed30d8/cert-manager-webhook/0.log" Oct 06 14:06:01 crc kubenswrapper[4892]: I1006 14:06:01.169516 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:06:01 crc kubenswrapper[4892]: I1006 14:06:01.630815 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"5d7b8f214a648808daa002ce6bc877f9d218d66a4ae7d5e12d71ddc6449f01f3"} Oct 06 14:06:03 crc kubenswrapper[4892]: I1006 14:06:03.755019 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-6js6s_175bab6c-5586-459d-b101-2ca420eb7885/nmstate-console-plugin/0.log" Oct 06 14:06:03 crc kubenswrapper[4892]: I1006 14:06:03.890067 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8hgsn_32fdd543-59ce-4feb-bd9f-9d804de6f71a/nmstate-handler/0.log" Oct 06 14:06:03 crc kubenswrapper[4892]: I1006 14:06:03.907189 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-hrjsh_b33f5260-a8fd-4654-9a8c-49e30ed7857d/kube-rbac-proxy/0.log" Oct 06 14:06:03 crc kubenswrapper[4892]: I1006 14:06:03.926500 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-hrjsh_b33f5260-a8fd-4654-9a8c-49e30ed7857d/nmstate-metrics/0.log" Oct 06 14:06:04 crc kubenswrapper[4892]: I1006 14:06:04.096645 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-bqk58_f6c46ff4-2ed1-4acc-bae3-05a8db533ed7/nmstate-webhook/0.log" Oct 06 14:06:04 crc kubenswrapper[4892]: I1006 14:06:04.129055 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-xvhkn_3912b44b-2305-4f14-8b86-5a5208df2442/nmstate-operator/0.log" Oct 06 14:06:17 crc kubenswrapper[4892]: I1006 14:06:17.795840 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xnc9c_8309be58-c242-4799-a7db-ebb0171b23de/kube-rbac-proxy/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.020874 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-xnc9c_8309be58-c242-4799-a7db-ebb0171b23de/controller/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.036207 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-pcrl5_9f2d191f-6b44-4c22-bddd-53bd6237ba29/frr-k8s-webhook-server/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 
14:06:18.192444 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-frr-files/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.381043 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-reloader/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.384930 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-frr-files/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.393831 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-metrics/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.406783 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-reloader/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.639956 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-frr-files/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.675314 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-metrics/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.677835 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-reloader/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.758509 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-metrics/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.910701 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-metrics/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.969860 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/controller/0.log" Oct 06 14:06:18 crc kubenswrapper[4892]: I1006 14:06:18.981074 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-frr-files/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.025972 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/cp-reloader/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.192881 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/frr-metrics/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.254661 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/kube-rbac-proxy/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.267115 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/kube-rbac-proxy-frr/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.436398 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/reloader/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.595637 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5dc65fffc5-fws96_0e90157b-0ee2-45ab-b457-e4dd396bfcf4/manager/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.780587 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-cf886c89f-mc4ms_3c11a8ae-6eac-4709-9c88-aa7048e7bb08/webhook-server/0.log" Oct 06 14:06:19 crc kubenswrapper[4892]: I1006 14:06:19.863128 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ffpm9_619837d2-8e2d-42d7-a34d-a3c1e39d213b/kube-rbac-proxy/0.log" Oct 06 14:06:20 crc kubenswrapper[4892]: I1006 14:06:20.761255 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-ffpm9_619837d2-8e2d-42d7-a34d-a3c1e39d213b/speaker/0.log" Oct 06 14:06:21 crc kubenswrapper[4892]: I1006 14:06:21.162205 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-zqfvv_344945cb-67e7-4600-a300-676dcddc3659/frr/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.179781 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp_b00a505d-365a-4d52-b900-33073e3b4e84/util/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.366139 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp_b00a505d-365a-4d52-b900-33073e3b4e84/util/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.374794 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp_b00a505d-365a-4d52-b900-33073e3b4e84/pull/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.392083 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp_b00a505d-365a-4d52-b900-33073e3b4e84/pull/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.576155 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp_b00a505d-365a-4d52-b900-33073e3b4e84/extract/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.596841 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp_b00a505d-365a-4d52-b900-33073e3b4e84/util/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.604370 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2f2fbp_b00a505d-365a-4d52-b900-33073e3b4e84/pull/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.758565 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24/util/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.924077 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24/pull/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.957069 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24/util/0.log" Oct 06 14:06:33 crc kubenswrapper[4892]: I1006 14:06:33.963413 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24/pull/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.273027 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24/util/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.283025 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24/pull/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.339450 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2d2bvt8_b0e4cf48-a27b-4fa1-89ba-efeb6dc3bf24/extract/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.467255 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q8fjn_44a5fd23-14c3-4215-a22c-9111e7a1c591/extract-utilities/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.650560 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q8fjn_44a5fd23-14c3-4215-a22c-9111e7a1c591/extract-content/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.672398 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q8fjn_44a5fd23-14c3-4215-a22c-9111e7a1c591/extract-content/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.702665 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q8fjn_44a5fd23-14c3-4215-a22c-9111e7a1c591/extract-utilities/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.911239 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q8fjn_44a5fd23-14c3-4215-a22c-9111e7a1c591/extract-utilities/0.log" Oct 06 14:06:34 crc kubenswrapper[4892]: I1006 14:06:34.913502 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q8fjn_44a5fd23-14c3-4215-a22c-9111e7a1c591/extract-content/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.177756 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kslsb_cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3/extract-utilities/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.386304 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kslsb_cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3/extract-utilities/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.402402 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-kslsb_cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3/extract-content/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.432076 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kslsb_cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3/extract-content/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.590537 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kslsb_cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3/extract-content/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.695998 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q8fjn_44a5fd23-14c3-4215-a22c-9111e7a1c591/registry-server/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.725100 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kslsb_cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3/extract-utilities/0.log" Oct 06 14:06:35 crc kubenswrapper[4892]: I1006 14:06:35.929947 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd_103c43a1-da8f-47d5-a72d-2f97e8bbae27/util/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.262279 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd_103c43a1-da8f-47d5-a72d-2f97e8bbae27/util/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.263280 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd_103c43a1-da8f-47d5-a72d-2f97e8bbae27/pull/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.295993 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd_103c43a1-da8f-47d5-a72d-2f97e8bbae27/pull/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.432922 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd_103c43a1-da8f-47d5-a72d-2f97e8bbae27/util/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.574576 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd_103c43a1-da8f-47d5-a72d-2f97e8bbae27/extract/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.580653 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ch5fxd_103c43a1-da8f-47d5-a72d-2f97e8bbae27/pull/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.646586 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-kslsb_cd5f0e41-d114-4ef9-a0ab-ad3dbf5cd8c3/registry-server/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.744996 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bdjg5_df6c1edd-eb9c-4b9d-a557-9dfa585c8a8a/marketplace-operator/0.log" Oct 06 14:06:36 crc kubenswrapper[4892]: I1006 14:06:36.873984 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-j2hm7_1bed95b1-6340-43d9-88be-140829a9a0ab/extract-utilities/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.064404 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j2hm7_1bed95b1-6340-43d9-88be-140829a9a0ab/extract-utilities/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.095951 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j2hm7_1bed95b1-6340-43d9-88be-140829a9a0ab/extract-content/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.126922 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j2hm7_1bed95b1-6340-43d9-88be-140829a9a0ab/extract-content/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.302101 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j2hm7_1bed95b1-6340-43d9-88be-140829a9a0ab/extract-utilities/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.302194 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j2hm7_1bed95b1-6340-43d9-88be-140829a9a0ab/extract-content/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.425754 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8n2m6_55ef17bc-6b08-450b-947d-1e3c5eb5f806/extract-utilities/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.520676 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j2hm7_1bed95b1-6340-43d9-88be-140829a9a0ab/registry-server/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.649377 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8n2m6_55ef17bc-6b08-450b-947d-1e3c5eb5f806/extract-utilities/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.667748 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8n2m6_55ef17bc-6b08-450b-947d-1e3c5eb5f806/extract-content/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.671427 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8n2m6_55ef17bc-6b08-450b-947d-1e3c5eb5f806/extract-content/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.846253 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8n2m6_55ef17bc-6b08-450b-947d-1e3c5eb5f806/extract-content/0.log" Oct 06 14:06:37 crc kubenswrapper[4892]: I1006 14:06:37.858552 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8n2m6_55ef17bc-6b08-450b-947d-1e3c5eb5f806/extract-utilities/0.log" Oct 06 14:06:38 crc kubenswrapper[4892]: I1006 14:06:38.696911 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8n2m6_55ef17bc-6b08-450b-947d-1e3c5eb5f806/registry-server/0.log" Oct 06 14:06:50 crc kubenswrapper[4892]: I1006 14:06:50.143086 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-8fxfv_aae867b4-2097-459f-a413-0ead7e4478ce/prometheus-operator/0.log" Oct 06 14:06:50 crc kubenswrapper[4892]: I1006 14:06:50.304034 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6dccdf64d8-8bpxd_47ebb6a8-d47f-4d58-a7f2-e02ebf2b07f6/prometheus-operator-admission-webhook/0.log" Oct 06 14:06:50 crc kubenswrapper[4892]: I1006 14:06:50.390564 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6dccdf64d8-fnqsx_6a0093d1-7a06-4147-816d-4bc7a73c505d/prometheus-operator-admission-webhook/0.log" Oct 06 14:06:50 crc kubenswrapper[4892]: I1006 14:06:50.508184 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-4dkvd_178f722d-bf92-4584-b0cf-9550c41b3153/operator/0.log" Oct 06 14:06:50 crc kubenswrapper[4892]: I1006 14:06:50.585259 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-gtbwr_aef96bb9-192a-4934-b46d-2e2cea0ac97e/perses-operator/0.log" Oct 06 14:08:22 crc kubenswrapper[4892]: I1006 14:08:22.984702 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:08:22 crc kubenswrapper[4892]: I1006 14:08:22.987081 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.802139 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zx7d7"] Oct 06 14:08:35 crc kubenswrapper[4892]: E1006 14:08:35.804712 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd6f012a-12a5-48ba-95b8-9bd09cb71050" containerName="container-00" Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.804743 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6f012a-12a5-48ba-95b8-9bd09cb71050" containerName="container-00" Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.805005 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd6f012a-12a5-48ba-95b8-9bd09cb71050" containerName="container-00" Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.806942 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.833379 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zx7d7"] Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.897459 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-utilities\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.897576 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-catalog-content\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:35 crc kubenswrapper[4892]: I1006 14:08:35.897601 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tdgv\" (UniqueName: \"kubernetes.io/projected/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-kube-api-access-5tdgv\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:35.999996 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-catalog-content\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:36.000254 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tdgv\" (UniqueName: \"kubernetes.io/projected/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-kube-api-access-5tdgv\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:36.000483 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-utilities\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:36.000952 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-catalog-content\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:36.001396 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-utilities\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:36.024774 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5tdgv\" (UniqueName: \"kubernetes.io/projected/182d7e4c-42b8-4ef2-ba3e-76278e9c57af-kube-api-access-5tdgv\") pod \"redhat-operators-zx7d7\" (UID: \"182d7e4c-42b8-4ef2-ba3e-76278e9c57af\") " pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:36.128769 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:36 crc kubenswrapper[4892]: W1006 14:08:36.636411 4892 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod182d7e4c_42b8_4ef2_ba3e_76278e9c57af.slice/crio-7f76d88a8d5aa023f5ddb14f2c6c5afcc01fc7693cffeda0f85e5d5d108fc6ef WatchSource:0}: Error finding container 7f76d88a8d5aa023f5ddb14f2c6c5afcc01fc7693cffeda0f85e5d5d108fc6ef: Status 404 returned error can't find the container with id 7f76d88a8d5aa023f5ddb14f2c6c5afcc01fc7693cffeda0f85e5d5d108fc6ef Oct 06 14:08:36 crc kubenswrapper[4892]: I1006 14:08:36.661981 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zx7d7"] Oct 06 14:08:37 crc kubenswrapper[4892]: I1006 14:08:37.321098 4892 generic.go:334] "Generic (PLEG): container finished" podID="182d7e4c-42b8-4ef2-ba3e-76278e9c57af" containerID="0ed94c64bd559ba4727b1a9048b1d17df1f17b5c1a041ead8eee9b94ec5aa5dc" exitCode=0 Oct 06 14:08:37 crc kubenswrapper[4892]: I1006 14:08:37.321144 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7d7" event={"ID":"182d7e4c-42b8-4ef2-ba3e-76278e9c57af","Type":"ContainerDied","Data":"0ed94c64bd559ba4727b1a9048b1d17df1f17b5c1a041ead8eee9b94ec5aa5dc"} Oct 06 14:08:37 crc kubenswrapper[4892]: I1006 14:08:37.321473 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7d7" event={"ID":"182d7e4c-42b8-4ef2-ba3e-76278e9c57af","Type":"ContainerStarted","Data":"7f76d88a8d5aa023f5ddb14f2c6c5afcc01fc7693cffeda0f85e5d5d108fc6ef"} Oct 06 14:08:37 crc kubenswrapper[4892]: I1006 14:08:37.323397 4892 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 06 14:08:49 crc kubenswrapper[4892]: I1006 14:08:49.456853 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7d7" event={"ID":"182d7e4c-42b8-4ef2-ba3e-76278e9c57af","Type":"ContainerStarted","Data":"63b5534390f60b915596d75107d4bda7bb159fdb92d600ccb3af87e27d9865ea"} Oct 06 14:08:51 crc kubenswrapper[4892]: I1006 14:08:51.482378 4892 generic.go:334] "Generic (PLEG): container finished" podID="182d7e4c-42b8-4ef2-ba3e-76278e9c57af" containerID="63b5534390f60b915596d75107d4bda7bb159fdb92d600ccb3af87e27d9865ea" exitCode=0 Oct 06 14:08:51 crc kubenswrapper[4892]: I1006 14:08:51.482517 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7d7" event={"ID":"182d7e4c-42b8-4ef2-ba3e-76278e9c57af","Type":"ContainerDied","Data":"63b5534390f60b915596d75107d4bda7bb159fdb92d600ccb3af87e27d9865ea"} Oct 06 14:08:52 crc kubenswrapper[4892]: I1006 14:08:52.493133 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zx7d7" event={"ID":"182d7e4c-42b8-4ef2-ba3e-76278e9c57af","Type":"ContainerStarted","Data":"ab5fc970efcb6eb38142d83944f6bcbbb72f5b451ed5b4c3fa5a8fbbd1d38bca"} Oct 06 14:08:52 crc kubenswrapper[4892]: I1006 14:08:52.510144 4892 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-zx7d7" podStartSLOduration=2.704768369 podStartE2EDuration="17.510123271s" podCreationTimestamp="2025-10-06 14:08:35 +0000 UTC" firstStartedPulling="2025-10-06 14:08:37.323089395 +0000 UTC m=+7203.872795160" lastFinishedPulling="2025-10-06 14:08:52.128444267 +0000 UTC m=+7218.678150062" observedRunningTime="2025-10-06 14:08:52.50728343 +0000 UTC m=+7219.056989205" watchObservedRunningTime="2025-10-06 14:08:52.510123271 +0000 UTC m=+7219.059829046" Oct 06 14:08:52 crc kubenswrapper[4892]: I1006 14:08:52.984267 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:08:52 crc kubenswrapper[4892]: I1006 14:08:52.984626 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:08:56 crc kubenswrapper[4892]: I1006 14:08:56.130832 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:56 crc kubenswrapper[4892]: I1006 14:08:56.131093 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:08:57 crc kubenswrapper[4892]: I1006 14:08:57.184238 4892 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zx7d7" podUID="182d7e4c-42b8-4ef2-ba3e-76278e9c57af" containerName="registry-server" probeResult="failure" output=< Oct 06 14:08:57 crc kubenswrapper[4892]: timeout: failed to connect service ":50051" within 1s Oct 06 14:08:57 crc kubenswrapper[4892]: > Oct 06 14:09:05 crc kubenswrapper[4892]: I1006 14:09:05.122111 4892 scope.go:117] "RemoveContainer" containerID="b686b1df9f7a3104fb59640b473aab6d49948982d1bed01e1babc271e1552a5c" Oct 06 14:09:05 crc kubenswrapper[4892]: I1006 14:09:05.644502 4892 generic.go:334] "Generic (PLEG): container finished" podID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerID="1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055" exitCode=0 Oct 06 14:09:05 crc kubenswrapper[4892]: I1006 14:09:05.644623 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ktz9n/must-gather-rthjm" event={"ID":"0e5e4441-6007-4f81-8b18-679b18dd08f0","Type":"ContainerDied","Data":"1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055"} Oct 06 14:09:05 crc kubenswrapper[4892]: I1006 14:09:05.645681 4892 scope.go:117] "RemoveContainer" containerID="1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055" Oct 06 14:09:06 crc kubenswrapper[4892]: I1006 14:09:06.204044 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:09:06 crc kubenswrapper[4892]: I1006 14:09:06.276759 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zx7d7" Oct 06 14:09:06 crc kubenswrapper[4892]: I1006 14:09:06.279170 4892 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-ktz9n_must-gather-rthjm_0e5e4441-6007-4f81-8b18-679b18dd08f0/gather/0.log" Oct 06 14:09:06 crc kubenswrapper[4892]: I1006 14:09:06.831360 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zx7d7"] Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.008074 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8n2m6"] Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.008640 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8n2m6" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="registry-server" containerID="cri-o://9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609" gracePeriod=2 Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.626020 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.688735 4892 generic.go:334] "Generic (PLEG): container finished" podID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerID="9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609" exitCode=0 Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.688836 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8n2m6" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.688904 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n2m6" event={"ID":"55ef17bc-6b08-450b-947d-1e3c5eb5f806","Type":"ContainerDied","Data":"9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609"} Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.688943 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8n2m6" event={"ID":"55ef17bc-6b08-450b-947d-1e3c5eb5f806","Type":"ContainerDied","Data":"f4e8ac7f20926177433e7110fbe7b6b2bcedd0d308b28a19d29c1bbfc454e7b5"} Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.688969 4892 scope.go:117] "RemoveContainer" containerID="9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.730723 4892 scope.go:117] "RemoveContainer" containerID="48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.753038 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbp75\" (UniqueName: \"kubernetes.io/projected/55ef17bc-6b08-450b-947d-1e3c5eb5f806-kube-api-access-hbp75\") pod \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.753108 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-utilities\") pod \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.753130 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-catalog-content\") pod \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\" (UID: \"55ef17bc-6b08-450b-947d-1e3c5eb5f806\") " Oct 06 14:09:07 crc 
kubenswrapper[4892]: I1006 14:09:07.754843 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-utilities" (OuterVolumeSpecName: "utilities") pod "55ef17bc-6b08-450b-947d-1e3c5eb5f806" (UID: "55ef17bc-6b08-450b-947d-1e3c5eb5f806"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.768496 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ef17bc-6b08-450b-947d-1e3c5eb5f806-kube-api-access-hbp75" (OuterVolumeSpecName: "kube-api-access-hbp75") pod "55ef17bc-6b08-450b-947d-1e3c5eb5f806" (UID: "55ef17bc-6b08-450b-947d-1e3c5eb5f806"). InnerVolumeSpecName "kube-api-access-hbp75". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.774945 4892 scope.go:117] "RemoveContainer" containerID="fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.878733 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbp75\" (UniqueName: \"kubernetes.io/projected/55ef17bc-6b08-450b-947d-1e3c5eb5f806-kube-api-access-hbp75\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.878780 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.904814 4892 scope.go:117] "RemoveContainer" containerID="9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609" Oct 06 14:09:07 crc kubenswrapper[4892]: E1006 14:09:07.921501 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609\": container with ID starting with 9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609 not found: ID does not exist" containerID="9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.921579 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609"} err="failed to get container status \"9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609\": rpc error: code = NotFound desc = could not find container \"9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609\": container with ID starting with 9661875334b5e77e13e9aed59382c78403076128c4c00e43c15ce5a9ccfd2609 not found: ID does not exist" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.921607 4892 scope.go:117] "RemoveContainer" containerID="48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e" Oct 06 14:09:07 crc kubenswrapper[4892]: E1006 14:09:07.930753 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e\": container with ID starting with 48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e not found: ID does not exist" containerID="48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.930812 4892 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e"} err="failed to get container status \"48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e\": rpc error: code = NotFound desc = could not find container \"48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e\": container with ID starting with 48fc3ed5a8e1e5d2c93a6a492da2b1e260ab2c7adfd227c2934d259e76d2d37e not found: ID does not exist" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.930865 4892 scope.go:117] "RemoveContainer" containerID="fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796" Oct 06 14:09:07 crc kubenswrapper[4892]: E1006 14:09:07.931620 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796\": container with ID starting with fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796 not found: ID does not exist" containerID="fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.931674 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796"} err="failed to get container status \"fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796\": rpc error: code = NotFound desc = could not find container \"fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796\": container with ID starting with fa5a2b71783880c382818806f27682047908725c992a55c78bb80487aa551796 not found: ID does not exist" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.954304 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55ef17bc-6b08-450b-947d-1e3c5eb5f806" (UID: "55ef17bc-6b08-450b-947d-1e3c5eb5f806"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:07 crc kubenswrapper[4892]: I1006 14:09:07.980742 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ef17bc-6b08-450b-947d-1e3c5eb5f806-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:08 crc kubenswrapper[4892]: I1006 14:09:08.032802 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8n2m6"] Oct 06 14:09:08 crc kubenswrapper[4892]: I1006 14:09:08.041956 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8n2m6"] Oct 06 14:09:08 crc kubenswrapper[4892]: I1006 14:09:08.182399 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" path="/var/lib/kubelet/pods/55ef17bc-6b08-450b-947d-1e3c5eb5f806/volumes" Oct 06 14:09:15 crc kubenswrapper[4892]: I1006 14:09:15.901815 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ktz9n/must-gather-rthjm"] Oct 06 14:09:15 crc kubenswrapper[4892]: I1006 14:09:15.902707 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ktz9n/must-gather-rthjm" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerName="copy" containerID="cri-o://81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635" gracePeriod=2 Oct 06 14:09:15 crc kubenswrapper[4892]: I1006 14:09:15.913345 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ktz9n/must-gather-rthjm"] Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.351840 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ktz9n_must-gather-rthjm_0e5e4441-6007-4f81-8b18-679b18dd08f0/copy/0.log" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.353157 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.365203 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e5e4441-6007-4f81-8b18-679b18dd08f0-must-gather-output\") pod \"0e5e4441-6007-4f81-8b18-679b18dd08f0\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.365248 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k444g\" (UniqueName: \"kubernetes.io/projected/0e5e4441-6007-4f81-8b18-679b18dd08f0-kube-api-access-k444g\") pod \"0e5e4441-6007-4f81-8b18-679b18dd08f0\" (UID: \"0e5e4441-6007-4f81-8b18-679b18dd08f0\") " Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.372744 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e5e4441-6007-4f81-8b18-679b18dd08f0-kube-api-access-k444g" (OuterVolumeSpecName: "kube-api-access-k444g") pod "0e5e4441-6007-4f81-8b18-679b18dd08f0" (UID: "0e5e4441-6007-4f81-8b18-679b18dd08f0"). InnerVolumeSpecName "kube-api-access-k444g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.467453 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k444g\" (UniqueName: \"kubernetes.io/projected/0e5e4441-6007-4f81-8b18-679b18dd08f0-kube-api-access-k444g\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.569898 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e5e4441-6007-4f81-8b18-679b18dd08f0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0e5e4441-6007-4f81-8b18-679b18dd08f0" (UID: "0e5e4441-6007-4f81-8b18-679b18dd08f0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.670217 4892 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0e5e4441-6007-4f81-8b18-679b18dd08f0-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.781384 4892 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ktz9n_must-gather-rthjm_0e5e4441-6007-4f81-8b18-679b18dd08f0/copy/0.log" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.781791 4892 generic.go:334] "Generic (PLEG): container finished" podID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerID="81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635" exitCode=143 Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.781854 4892 scope.go:117] "RemoveContainer" containerID="81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.781869 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ktz9n/must-gather-rthjm" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.800703 4892 scope.go:117] "RemoveContainer" containerID="1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.877625 4892 scope.go:117] "RemoveContainer" containerID="81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635" Oct 06 14:09:16 crc kubenswrapper[4892]: E1006 14:09:16.878047 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635\": container with ID starting with 81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635 not found: ID does not exist" containerID="81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.878079 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635"} err="failed to get container status \"81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635\": rpc error: code = NotFound desc = could not find container \"81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635\": container with ID starting with 81850ed504be3ffc6f781a9f2b965c2e41edfd2cf75b4a1995433e8383433635 not found: ID does not exist" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.878100 4892 scope.go:117] "RemoveContainer" containerID="1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055" Oct 06 14:09:16 crc kubenswrapper[4892]: E1006 14:09:16.878286 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055\": container with ID starting with 1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055 not found: ID does not exist" containerID="1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055" Oct 06 14:09:16 crc kubenswrapper[4892]: I1006 14:09:16.878318 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055"} err="failed to get container status \"1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055\": rpc error: code = NotFound desc = could not find container \"1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055\": container with ID starting with 1679b32c3923b0264f6c91926d293de7f197c10f23b587e5b636499e1a834055 not found: ID does not exist" Oct 06 14:09:18 crc kubenswrapper[4892]: I1006 14:09:18.180606 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" path="/var/lib/kubelet/pods/0e5e4441-6007-4f81-8b18-679b18dd08f0/volumes" Oct 06 14:09:22 crc kubenswrapper[4892]: I1006 14:09:22.984809 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:09:22 crc kubenswrapper[4892]: I1006 14:09:22.985236 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" 
podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 06 14:09:22 crc kubenswrapper[4892]: I1006 14:09:22.985286 4892 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" Oct 06 14:09:22 crc kubenswrapper[4892]: I1006 14:09:22.987540 4892 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5d7b8f214a648808daa002ce6bc877f9d218d66a4ae7d5e12d71ddc6449f01f3"} pod="openshift-machine-config-operator/machine-config-daemon-4t26s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 06 14:09:22 crc kubenswrapper[4892]: I1006 14:09:22.987644 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" containerID="cri-o://5d7b8f214a648808daa002ce6bc877f9d218d66a4ae7d5e12d71ddc6449f01f3" gracePeriod=600 Oct 06 14:09:23 crc kubenswrapper[4892]: I1006 14:09:23.878438 4892 generic.go:334] "Generic (PLEG): container finished" podID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerID="5d7b8f214a648808daa002ce6bc877f9d218d66a4ae7d5e12d71ddc6449f01f3" exitCode=0 Oct 06 14:09:23 crc kubenswrapper[4892]: I1006 14:09:23.879233 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerDied","Data":"5d7b8f214a648808daa002ce6bc877f9d218d66a4ae7d5e12d71ddc6449f01f3"} Oct 06 14:09:23 crc kubenswrapper[4892]: I1006 14:09:23.879360 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" event={"ID":"f0107ee8-a9e2-4a14-b044-1c37a9df4d38","Type":"ContainerStarted","Data":"686c43217f8245f21ef432b161b32ebe0027f7c5c068a368b57056a3d8762fc2"} Oct 06 14:09:23 crc kubenswrapper[4892]: I1006 14:09:23.879499 4892 scope.go:117] "RemoveContainer" containerID="94c51515e0e60ce5d35b32d09693e41f9f8c6a04ad578f88c920637978db2948" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.924887 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d52nd"] Oct 06 14:09:28 crc kubenswrapper[4892]: E1006 14:09:28.926027 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerName="gather" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.926044 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerName="gather" Oct 06 14:09:28 crc kubenswrapper[4892]: E1006 14:09:28.926067 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="extract-utilities" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.926075 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="extract-utilities" Oct 06 14:09:28 crc kubenswrapper[4892]: E1006 14:09:28.926107 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="extract-content" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 
14:09:28.926116 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="extract-content" Oct 06 14:09:28 crc kubenswrapper[4892]: E1006 14:09:28.926140 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="registry-server" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.926149 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="registry-server" Oct 06 14:09:28 crc kubenswrapper[4892]: E1006 14:09:28.926166 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerName="copy" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.926174 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerName="copy" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.926506 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ef17bc-6b08-450b-947d-1e3c5eb5f806" containerName="registry-server" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.926535 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerName="copy" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.926549 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e5e4441-6007-4f81-8b18-679b18dd08f0" containerName="gather" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.928549 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:28 crc kubenswrapper[4892]: I1006 14:09:28.935723 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d52nd"] Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.043422 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-catalog-content\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.043903 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-utilities\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.044490 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9s6c\" (UniqueName: \"kubernetes.io/projected/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-kube-api-access-d9s6c\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.146962 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-catalog-content\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 
14:09:29.147008 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-utilities\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.147148 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9s6c\" (UniqueName: \"kubernetes.io/projected/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-kube-api-access-d9s6c\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.147578 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-catalog-content\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.148268 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-utilities\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.171648 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9s6c\" (UniqueName: \"kubernetes.io/projected/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-kube-api-access-d9s6c\") pod \"redhat-marketplace-d52nd\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.267715 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.803365 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d52nd"] Oct 06 14:09:29 crc kubenswrapper[4892]: I1006 14:09:29.975004 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d52nd" event={"ID":"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4","Type":"ContainerStarted","Data":"d6cc2cde892dec6e56b668fcd4500d64a496f520a6e4fcc1cb6e976973829202"} Oct 06 14:09:30 crc kubenswrapper[4892]: I1006 14:09:30.996557 4892 generic.go:334] "Generic (PLEG): container finished" podID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerID="27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516" exitCode=0 Oct 06 14:09:30 crc kubenswrapper[4892]: I1006 14:09:30.997065 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d52nd" event={"ID":"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4","Type":"ContainerDied","Data":"27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516"} Oct 06 14:09:33 crc kubenswrapper[4892]: I1006 14:09:33.017650 4892 generic.go:334] "Generic (PLEG): container finished" podID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerID="ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47" exitCode=0 Oct 06 14:09:33 crc kubenswrapper[4892]: I1006 14:09:33.017750 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d52nd" event={"ID":"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4","Type":"ContainerDied","Data":"ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47"} Oct 06 14:09:34 crc kubenswrapper[4892]: I1006 14:09:34.029979 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d52nd" event={"ID":"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4","Type":"ContainerStarted","Data":"02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0"} Oct 06 14:09:39 crc kubenswrapper[4892]: I1006 14:09:39.268566 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:39 crc kubenswrapper[4892]: I1006 14:09:39.269105 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:39 crc kubenswrapper[4892]: I1006 14:09:39.348987 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:39 crc kubenswrapper[4892]: I1006 14:09:39.385437 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d52nd" podStartSLOduration=8.879428023 podStartE2EDuration="11.385413105s" podCreationTimestamp="2025-10-06 14:09:28 +0000 UTC" firstStartedPulling="2025-10-06 14:09:30.999963437 +0000 UTC m=+7257.549669212" lastFinishedPulling="2025-10-06 14:09:33.505948519 +0000 UTC m=+7260.055654294" observedRunningTime="2025-10-06 14:09:34.085780731 +0000 UTC m=+7260.635486506" watchObservedRunningTime="2025-10-06 14:09:39.385413105 +0000 UTC m=+7265.935118890" Oct 06 14:09:40 crc kubenswrapper[4892]: I1006 14:09:40.181310 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:40 crc kubenswrapper[4892]: I1006 14:09:40.232207 4892 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-d52nd"] Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.134726 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d52nd" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="registry-server" containerID="cri-o://02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0" gracePeriod=2 Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.573712 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.743019 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-catalog-content\") pod \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.743653 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9s6c\" (UniqueName: \"kubernetes.io/projected/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-kube-api-access-d9s6c\") pod \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.743725 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-utilities\") pod \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\" (UID: \"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4\") " Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.744813 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-utilities" (OuterVolumeSpecName: "utilities") pod "cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" (UID: "cf0d7f3e-763f-4064-bd97-a0098f4d0ba4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.752714 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-kube-api-access-d9s6c" (OuterVolumeSpecName: "kube-api-access-d9s6c") pod "cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" (UID: "cf0d7f3e-763f-4064-bd97-a0098f4d0ba4"). InnerVolumeSpecName "kube-api-access-d9s6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.764008 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" (UID: "cf0d7f3e-763f-4064-bd97-a0098f4d0ba4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.846854 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.846900 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9s6c\" (UniqueName: \"kubernetes.io/projected/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-kube-api-access-d9s6c\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:42 crc kubenswrapper[4892]: I1006 14:09:42.846915 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.146471 4892 generic.go:334] "Generic (PLEG): container finished" podID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerID="02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0" exitCode=0 Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.146631 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d52nd" event={"ID":"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4","Type":"ContainerDied","Data":"02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0"} Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.147762 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d52nd" event={"ID":"cf0d7f3e-763f-4064-bd97-a0098f4d0ba4","Type":"ContainerDied","Data":"d6cc2cde892dec6e56b668fcd4500d64a496f520a6e4fcc1cb6e976973829202"} Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.146703 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d52nd" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.147871 4892 scope.go:117] "RemoveContainer" containerID="02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.179967 4892 scope.go:117] "RemoveContainer" containerID="ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.185594 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d52nd"] Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.194598 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d52nd"] Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.225655 4892 scope.go:117] "RemoveContainer" containerID="27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.254701 4892 scope.go:117] "RemoveContainer" containerID="02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0" Oct 06 14:09:43 crc kubenswrapper[4892]: E1006 14:09:43.255340 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0\": container with ID starting with 02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0 not found: ID does not exist" containerID="02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.255383 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0"} err="failed to get container status \"02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0\": rpc error: code = NotFound desc = could not find container \"02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0\": container with ID starting with 02ed948907a5d9d66747a5a2537aabd9b7fe10841817cd3dbee1ca465d0fa3b0 not found: ID does not exist" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.255412 4892 scope.go:117] "RemoveContainer" containerID="ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47" Oct 06 14:09:43 crc kubenswrapper[4892]: E1006 14:09:43.255868 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47\": container with ID starting with ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47 not found: ID does not exist" containerID="ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.255925 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47"} err="failed to get container status \"ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47\": rpc error: code = NotFound desc = could not find container \"ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47\": container with ID starting with ea73144f9a6c8cb5e8a0364806e0a3681571d8c8513a556164aeaa974f894e47 not found: ID does not exist" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.255958 4892 scope.go:117] "RemoveContainer" 
containerID="27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516" Oct 06 14:09:43 crc kubenswrapper[4892]: E1006 14:09:43.256539 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516\": container with ID starting with 27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516 not found: ID does not exist" containerID="27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516" Oct 06 14:09:43 crc kubenswrapper[4892]: I1006 14:09:43.256580 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516"} err="failed to get container status \"27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516\": rpc error: code = NotFound desc = could not find container \"27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516\": container with ID starting with 27f5400223268b1c9bb612e5c9657cee5725634b55fe3e8502da90613e115516 not found: ID does not exist" Oct 06 14:09:44 crc kubenswrapper[4892]: I1006 14:09:44.182866 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" path="/var/lib/kubelet/pods/cf0d7f3e-763f-4064-bd97-a0098f4d0ba4/volumes" Oct 06 14:10:19 crc kubenswrapper[4892]: I1006 14:10:19.875137 4892 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qpm4t"] Oct 06 14:10:19 crc kubenswrapper[4892]: E1006 14:10:19.876195 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="extract-utilities" Oct 06 14:10:19 crc kubenswrapper[4892]: I1006 14:10:19.876211 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="extract-utilities" Oct 06 14:10:19 crc kubenswrapper[4892]: E1006 14:10:19.876230 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="registry-server" Oct 06 14:10:19 crc kubenswrapper[4892]: I1006 14:10:19.876236 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="registry-server" Oct 06 14:10:19 crc kubenswrapper[4892]: E1006 14:10:19.876259 4892 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="extract-content" Oct 06 14:10:19 crc kubenswrapper[4892]: I1006 14:10:19.876265 4892 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="extract-content" Oct 06 14:10:19 crc kubenswrapper[4892]: I1006 14:10:19.876495 4892 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0d7f3e-763f-4064-bd97-a0098f4d0ba4" containerName="registry-server" Oct 06 14:10:19 crc kubenswrapper[4892]: I1006 14:10:19.877941 4892 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:19 crc kubenswrapper[4892]: I1006 14:10:19.887210 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpm4t"] Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.005997 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-catalog-content\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.006116 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz45x\" (UniqueName: \"kubernetes.io/projected/74160f65-72f0-4ec1-87e6-58fda7e28896-kube-api-access-kz45x\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.006196 4892 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-utilities\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.108053 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz45x\" (UniqueName: \"kubernetes.io/projected/74160f65-72f0-4ec1-87e6-58fda7e28896-kube-api-access-kz45x\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.108125 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-utilities\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.108222 4892 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-catalog-content\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.108651 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-catalog-content\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.109144 4892 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-utilities\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.127312 4892 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kz45x\" (UniqueName: \"kubernetes.io/projected/74160f65-72f0-4ec1-87e6-58fda7e28896-kube-api-access-kz45x\") pod \"certified-operators-qpm4t\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.195406 4892 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:20 crc kubenswrapper[4892]: I1006 14:10:20.789191 4892 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpm4t"] Oct 06 14:10:21 crc kubenswrapper[4892]: I1006 14:10:21.538734 4892 generic.go:334] "Generic (PLEG): container finished" podID="74160f65-72f0-4ec1-87e6-58fda7e28896" containerID="c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f" exitCode=0 Oct 06 14:10:21 crc kubenswrapper[4892]: I1006 14:10:21.538908 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpm4t" event={"ID":"74160f65-72f0-4ec1-87e6-58fda7e28896","Type":"ContainerDied","Data":"c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f"} Oct 06 14:10:21 crc kubenswrapper[4892]: I1006 14:10:21.539037 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpm4t" event={"ID":"74160f65-72f0-4ec1-87e6-58fda7e28896","Type":"ContainerStarted","Data":"930584a38246355ffafc34de142045420438c54ab2a230223f57fee9d736be8a"} Oct 06 14:10:22 crc kubenswrapper[4892]: I1006 14:10:22.572828 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpm4t" event={"ID":"74160f65-72f0-4ec1-87e6-58fda7e28896","Type":"ContainerStarted","Data":"20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd"} Oct 06 14:10:23 crc kubenswrapper[4892]: I1006 14:10:23.587170 4892 generic.go:334] "Generic (PLEG): container finished" podID="74160f65-72f0-4ec1-87e6-58fda7e28896" containerID="20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd" exitCode=0 Oct 06 14:10:23 crc kubenswrapper[4892]: I1006 14:10:23.587447 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpm4t" event={"ID":"74160f65-72f0-4ec1-87e6-58fda7e28896","Type":"ContainerDied","Data":"20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd"} Oct 06 14:10:24 crc kubenswrapper[4892]: I1006 14:10:24.598972 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpm4t" event={"ID":"74160f65-72f0-4ec1-87e6-58fda7e28896","Type":"ContainerStarted","Data":"f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9"} Oct 06 14:10:24 crc kubenswrapper[4892]: I1006 14:10:24.624063 4892 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qpm4t" podStartSLOduration=3.133534944 podStartE2EDuration="5.624046132s" podCreationTimestamp="2025-10-06 14:10:19 +0000 UTC" firstStartedPulling="2025-10-06 14:10:21.542163512 +0000 UTC m=+7308.091869277" lastFinishedPulling="2025-10-06 14:10:24.0326747 +0000 UTC m=+7310.582380465" observedRunningTime="2025-10-06 14:10:24.617308188 +0000 UTC m=+7311.167013953" watchObservedRunningTime="2025-10-06 14:10:24.624046132 +0000 UTC m=+7311.173751897" Oct 06 14:10:30 crc kubenswrapper[4892]: I1006 14:10:30.195912 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:30 crc kubenswrapper[4892]: I1006 14:10:30.196523 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:30 crc kubenswrapper[4892]: I1006 14:10:30.257425 4892 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:30 crc kubenswrapper[4892]: I1006 14:10:30.723900 4892 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:30 crc kubenswrapper[4892]: I1006 14:10:30.792193 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpm4t"] Oct 06 14:10:32 crc kubenswrapper[4892]: I1006 14:10:32.685144 4892 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qpm4t" podUID="74160f65-72f0-4ec1-87e6-58fda7e28896" containerName="registry-server" containerID="cri-o://f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9" gracePeriod=2 Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.238002 4892 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.411615 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz45x\" (UniqueName: \"kubernetes.io/projected/74160f65-72f0-4ec1-87e6-58fda7e28896-kube-api-access-kz45x\") pod \"74160f65-72f0-4ec1-87e6-58fda7e28896\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.412103 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-utilities\") pod \"74160f65-72f0-4ec1-87e6-58fda7e28896\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.412268 4892 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-catalog-content\") pod \"74160f65-72f0-4ec1-87e6-58fda7e28896\" (UID: \"74160f65-72f0-4ec1-87e6-58fda7e28896\") " Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.427786 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-utilities" (OuterVolumeSpecName: "utilities") pod "74160f65-72f0-4ec1-87e6-58fda7e28896" (UID: "74160f65-72f0-4ec1-87e6-58fda7e28896"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.432405 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74160f65-72f0-4ec1-87e6-58fda7e28896-kube-api-access-kz45x" (OuterVolumeSpecName: "kube-api-access-kz45x") pod "74160f65-72f0-4ec1-87e6-58fda7e28896" (UID: "74160f65-72f0-4ec1-87e6-58fda7e28896"). InnerVolumeSpecName "kube-api-access-kz45x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.499189 4892 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74160f65-72f0-4ec1-87e6-58fda7e28896" (UID: "74160f65-72f0-4ec1-87e6-58fda7e28896"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.515868 4892 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-utilities\") on node \"crc\" DevicePath \"\"" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.516126 4892 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74160f65-72f0-4ec1-87e6-58fda7e28896-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.516418 4892 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz45x\" (UniqueName: \"kubernetes.io/projected/74160f65-72f0-4ec1-87e6-58fda7e28896-kube-api-access-kz45x\") on node \"crc\" DevicePath \"\"" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.696504 4892 generic.go:334] "Generic (PLEG): container finished" podID="74160f65-72f0-4ec1-87e6-58fda7e28896" containerID="f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9" exitCode=0 Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.696545 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpm4t" event={"ID":"74160f65-72f0-4ec1-87e6-58fda7e28896","Type":"ContainerDied","Data":"f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9"} Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.696550 4892 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpm4t" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.696569 4892 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpm4t" event={"ID":"74160f65-72f0-4ec1-87e6-58fda7e28896","Type":"ContainerDied","Data":"930584a38246355ffafc34de142045420438c54ab2a230223f57fee9d736be8a"} Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.696586 4892 scope.go:117] "RemoveContainer" containerID="f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.741703 4892 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpm4t"] Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.741814 4892 scope.go:117] "RemoveContainer" containerID="20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.753135 4892 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qpm4t"] Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.766690 4892 scope.go:117] "RemoveContainer" containerID="c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.842903 4892 scope.go:117] "RemoveContainer" containerID="f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9" Oct 06 14:10:33 crc kubenswrapper[4892]: E1006 14:10:33.843389 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9\": container with ID starting with f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9 not found: ID does not exist" containerID="f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.843433 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9"} err="failed to get container status \"f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9\": rpc error: code = NotFound desc = could not find container \"f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9\": container with ID starting with f2189f08b5cd839a9f6fb54dc74174e2ac7dd8aa98ab0bf75f6f7a37c93556a9 not found: ID does not exist" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.843461 4892 scope.go:117] "RemoveContainer" containerID="20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd" Oct 06 14:10:33 crc kubenswrapper[4892]: E1006 14:10:33.844011 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd\": container with ID starting with 20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd not found: ID does not exist" containerID="20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.844048 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd"} err="failed to get container status \"20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd\": rpc error: code = NotFound desc = could not find 
container \"20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd\": container with ID starting with 20c19e42df8b832f5e9483ab6ba1567527fb4804665391e42bf879a54203c6fd not found: ID does not exist" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.844066 4892 scope.go:117] "RemoveContainer" containerID="c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f" Oct 06 14:10:33 crc kubenswrapper[4892]: E1006 14:10:33.844350 4892 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f\": container with ID starting with c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f not found: ID does not exist" containerID="c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f" Oct 06 14:10:33 crc kubenswrapper[4892]: I1006 14:10:33.844375 4892 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f"} err="failed to get container status \"c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f\": rpc error: code = NotFound desc = could not find container \"c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f\": container with ID starting with c4bda25170fcb3829d23d48160a322dba237dea621d672df6fe270dcd9c99d8f not found: ID does not exist" Oct 06 14:10:34 crc kubenswrapper[4892]: I1006 14:10:34.180996 4892 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74160f65-72f0-4ec1-87e6-58fda7e28896" path="/var/lib/kubelet/pods/74160f65-72f0-4ec1-87e6-58fda7e28896/volumes" Oct 06 14:11:52 crc kubenswrapper[4892]: I1006 14:11:52.984046 4892 patch_prober.go:28] interesting pod/machine-config-daemon-4t26s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 06 14:11:52 crc kubenswrapper[4892]: I1006 14:11:52.984633 4892 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4t26s" podUID="f0107ee8-a9e2-4a14-b044-1c37a9df4d38" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"